April 30, 2024 43 mins

Daniel gets educated on artificial intelligence by UCLA computer science Professor Guy Van den Broeck. 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Just raise your hand once you realize that you're...
that you know that you're talking to a complete idiot. Pasha.
Tosh Show. Welcome to Tosh Show. If you're proud

(00:21):
to be Latino, stand the fuck up. Go ahead and
hit me with some of that three oh five music.
All right, it's a good start, you know, Eddie. I
have heard through the grapevine that some people don't like

(00:46):
the production value of our theme song. Now, I may
shock a lot of you listeners out there, but we
did not spend a lot of money on the music
for Tosh Show. But I am open to new suggestions,
but not by, uh, traditional musicians. No, no, no, this podcast

(01:07):
is the future of entertainment. So what better than AI
to create our potential new theme song? Right okay, I
I hear we've got a few candidates to audition. Let
me go ahead and hear this first one here.

Speaker 2 (01:23):
You get ready stands?

Speaker 1 (01:36):
That sounds like a train wreck. I don't like that
one bit.

Speaker 3 (01:40):
It's probably the prompt.

Speaker 1 (01:41):
No, no, there's too many words to listen to. All right,
we got another one?

Speaker 2 (01:46):
Yeah, the Barca...

Speaker 1 (01:54):
Lasting we want to know. Now, that's got some energy.
What I worry about is people listening to this thing
first thing in the morning, and they're not ready for
that Nickelback rage. That might be too much. We might
be getting people spilling their coffee. Then we've got a

(02:15):
lawsuit on our hands. We have any more?

Speaker 3 (02:17):
Here, let's play one more.

Speaker 1 (02:19):
All right? Let me hear another one? Welcome today.

Speaker 2 (02:24):
Here just every day.

Speaker 1 (02:30):
God, that's awful. Here's the thing. AI is not ready.
I say, we stick with what we've got. That's right. Oh, man,
I worry about AI every morning. This is the first
thing I do. Is AI sleeping with my wife? Is

(02:51):
AI diddling my kids? That is what I worry about. It's
just nonstop. Oh, but what's AI doing? If AI was taking
my dog for a walk, that I could get my
head around; that I would enjoy. Now I'm told, thanks
to AI, that this podcast is now broadcast in over

(03:14):
seven thousand languages. And I'm gonna be honest with you,
I didn't know there were seven thousand languages. Let's see
what Tosh Show sounds like in Spanish, Eddie. It's the program...
Now that podcast is no longer under the category of comedy.

(03:34):
That one's under sexy. This is why you can't ever
be convicted of a crime in this country, because you
just be like, I didn't do that. That was AI.
Like, I never said those words. I don't speak fluent,
beautiful Spanish. Do you want to hear me speak Chinese? Whatever.
I try to ease this, Sir Lien Eddie transorg GM Shirsanger,

(03:58):
Oh man, what I wouldn't give to be able to
speak whatever beautiful language that was. Speaking of Chinese, I
just finished watching Shōgun, which I know is Japanese,
or as they like to say, "the Japans," but it's
all under the Asian umbrella, which they like to hold.

(04:19):
A lot of them like to use umbrellas. This is
what I wanted to say: I'm not... there's no spoilers. I'm not
giving any spoilers. Well, I will, but it's just
episode one that I'm going to talk about. And I
understand that this is set in, I think, the
sixteen hundreds or something like that, a medieval period, and it's
a different culture, okay. But I like to just play

(04:42):
it out as if it happened in my life today,
Like how would that go over? I just liked the
idea of me coming home from work. So Eddie, you
can play the role of my wife for this and
you just say, how was your day, honey?

Speaker 3 (05:00):
How is your day?

Speaker 2 (05:00):
Honey?

Speaker 1 (05:01):
Oh, funny you should ask. You know how I like
to, uh, smart off sometimes at work? Well, I
spoke up when I wasn't supposed to. Anyway, we're gonna
have to kill our children and I have to kill myself.
So yeah. All right, well, uh, speaking of AI,

(05:32):
today's guest a genius, a certified genius, a professor at
the University of California, Los Angeles. This guy knows everything
about AI. He's gonna calm my nerves hopefully, or he's
going to create new worries for me at night. Also,
I have to I have to mention that this was

(05:54):
the interview that was recorded the day that Dylan had
a colossal fuckup, a brain shart heard around the world.
So there were some audio issues, But the irony is
that Dylan figured out how to use AI to fix

(06:15):
his blundering buffoonery. Enjoy. Pasha. My guest today is the
smartest person we've had on the show, and not just
because he's a foreigner with four words in his name
who also happens to have facial hair and wears glasses.
Please welcome our distinguished guest, AI expert and UCLA professor,

(06:39):
Guy. Doctor Guy, Professor Guy, Mister Guy.

Speaker 4 (06:43):
What do you want?

Speaker 3 (06:44):
It's all good, just Guy is fine.

Speaker 4 (06:45):
Thank you all right?

Speaker 3 (06:46):
Guy?

Speaker 1 (06:46):
Where are you from? I'm from Belgium. Belgium! Well, how long have
you been here?

Speaker 3 (06:50):
I moved here in twenty fifteen. Citizen? Not yet. Soon,
maybe next year.

Speaker 4 (06:55):
Are you actually going to do it?

Speaker 3 (06:57):
I think I'm eligible in like a month.

Speaker 1 (06:58):
So congratulations, thank you welcome. Do you like America?

Speaker 3 (07:04):
Yeah? I mean it gave me a lot of opportunities,
so yeah, I love it here.

Speaker 1 (07:07):
How often do you go back?

Speaker 3 (07:08):
Maybe once or twice a year. It's a long trek, yeah, yeah,
especially with a toddler. It's not so nice to travel
for like thirteen hours.

Speaker 4 (07:15):
How old is your toddler?

Speaker 3 (07:16):
Three years old?

Speaker 1 (07:17):
And now, are you, as an academic person... do you still...
I mean, can you relate to a toddler? Do you
act silly? Are you a silly dad?

Speaker 3 (07:26):
Yeah?

Speaker 1 (07:27):
Are you strict? No?

Speaker 3 (07:29):
I'm quite silly, I would say. I don't really like
to be the serious professor too much. It's not my
style.

Speaker 1 (07:34):
Now, I mean, I feel like the reason I had children
was because I wanted an excuse not to care so
much about everything else.

Speaker 3 (07:44):
Yeah, and you cannot imagine before it happens, right? It's
kind of, yeah, like you cannot prepare for this, and
then suddenly everything changes. And you know, it's also kind
of interesting that you see
them grow and, like, learn things, and you're like, oh yeah,
this is different from how AI learns. You know,
you kind of get some perspective on what learning is,
just from seeing a toddler learn how to walk and

(08:04):
talk and you know all that stuff.

Speaker 1 (08:06):
Right. I'm just like, like, whatever, the world's going to
shit and I'm gonna still have to, you know, build
this horrible Lego thing that he just bought. Do you
believe in ghosts? No?

Speaker 3 (08:18):
Okay, I feel like you ask every guest the same question.
Is that a usual question?

Speaker 1 (08:22):
That's the first question I ask everyone? Do you believe
in ghosts?

Speaker 4 (08:25):
Not at all?

Speaker 1 (08:25):
Move on. Do you believe in digital ghosts, or those
death bots?

Speaker 3 (08:30):
What are those?

Speaker 1 (08:31):
Like some people that have lost someone and then they
create their like this digital ghost.

Speaker 3 (08:36):
I'm sure it's comforting. It's like watching videos
and pictures of people.

Speaker 1 (08:40):
Do you think that's a good thing for the psyche,
as a way to move on?

Speaker 3 (08:43):
No, I think it's a bit creepy. I wouldn't
really enjoy it. I would rather watch videos from
ten years ago.

Speaker 1 (08:48):
Yeah, yeah, that's always sad too, just to keep replaying
that video like John Wick. Uh, why did you leave Belgium,
a smart country, to come to good old dumb America?

Speaker 3 (09:03):
Yeah, you know, honestly, no one wanted to give me
a job as a professor in Belgium, and I had
to come here as an academic refugee.

Speaker 4 (09:11):
What is your actual job.

Speaker 3 (09:12):
I'm a professor of computer science at UCLA and I
teach AI.

Speaker 1 (09:16):
You teach AI.

Speaker 4 (09:18):
Are you worried that your job will be taken?

Speaker 3 (09:21):
I mean the teaching part. I mean, the more
AI will help teach, the less kind of
boring stuff I have to do, and the more interesting
things I can teach that are maybe less off the shelf.
So I wouldn't mind more AI help.

Speaker 1 (09:34):
How did you get into computers in the first place?

Speaker 3 (09:35):
I mean, I was a big nerd and still am,
I guess, and I like to program, I like to
play computer games, and you know, just like any other
nerd, that's how I started with computers.

Speaker 1 (09:45):
Do you still play the games?

Speaker 3 (09:47):
Yes. Yes, yes. I wish you hadn't asked.

Speaker 4 (09:51):
But yeah, like, do you put in real time?

Speaker 3 (09:54):
I have this rule where I, you know, maybe once
a year I'll spend the weekend, like, binge playing,
like, without sleep, just to kind of reset myself.
And like, you know, I'm not worried too much about
being a civilized person with a real job. And you know,
I think everyone needs to do something like that every once
in a while.

Speaker 1 (10:11):
So, is there a particular game that you care about?

Speaker 3 (10:13):
Last one I played was... I played Civilization VI again.
I don't know if you know the game. Strategy games.

Speaker 4 (10:20):
I'm not a game person.

Speaker 1 (10:21):
And my brother, my brother is a game person,
who is a computer programmer and then created a company
for gaming and also for the government and then sold it.
And you know, he's very much that kid. Yeah, in another
world, you two would like to meet him, but no.

Speaker 4 (10:41):
He... I was...

Speaker 1 (10:41):
I told him that I was interviewing you, and he
was like, okay. So then he added a couple things.

Speaker 4 (10:46):
He's like, bring it, bring this up.

Speaker 1 (10:48):
It seems like self-driving cars got to eighty percent
good really quickly, but progress has stalled with getting to
one hundred percent. Will we see that with AI for
programming too, like a helpful tool that still needs someone
to steer it, or will it quickly get to the
point that we don't need software engineers?

Speaker 3 (11:07):
Yeah, so I think it's it's kind of a little
bit of both.

Speaker 1 (11:09):
Right.

Speaker 3 (11:09):
So on the one hand, yeah, like even today, your
brother is probably already using generative AI to help him program,
and so that's definitely happening. Whether your brother will be
completely replaced, I highly doubt it. I think there's you know.

Speaker 1 (11:22):
He doesn't have a job now. He owns...

Speaker 3 (11:24):
He's just doing it for fun.

Speaker 1 (11:25):
He does it, like, six months, sells it, does something else.

Speaker 3 (11:28):
I mean, it depends. If it's kind of this boilerplate
stuff where it's very similar to what many people have
done before and it's just, like, minor tweaks to things,
then yes, AI is probably going to be able to
do that because AI is really good at like finding
similar things and kind of slightly modifying them. But if
it's actually building like new software that does new things,
that's going to be much harder for AI to achieve.

Speaker 1 (11:48):
You teach at UCLA. You think it's a bad idea
that UCLA left the Pac-12 for the Big Ten?

Speaker 4 (11:55):
And are you aware of sports?

Speaker 3 (11:58):
I heard this is a big deal in
the real world. Yes, I heard about it. I
don't know. I am sure UCLA only makes
good decisions, like hiring me and moving to the Pac-12
or whatever it is.

Speaker 1 (12:10):
Did you ever have Lonzo Ball in one of your classes?

Speaker 3 (12:12):
Who is that?

Speaker 1 (12:13):
He was a basketball player? I'm sorry, popular a few
years ago.

Speaker 3 (12:16):
We get student athletes in class all the time,
but I don't really... I don't know him.

Speaker 4 (12:21):
Are you told it? Hey?

Speaker 3 (12:23):
No, definitely not.

Speaker 1 (12:24):
Settle a debate for us here. Who is the father of
artificial intelligence, Alan Turning or John McCarthy?

Speaker 3 (12:31):
So Alan Turing was... he was kind of the father
of computer science, and he already said, like, let's build
a computer that can play chess. So in that sense,
Turing is kind of the first. But then McCarthy was
the one that called it AI and really kind of
started the field of AI. So I think they both
get credit.

Speaker 4 (12:48):
Okay, well, I want you to know that there was
no debate here.

Speaker 1 (12:51):
I posed my question to make you feel like you
were among academics, but nothing could be further from the truth.

Speaker 3 (12:56):
I mean, your question was great, except you said Turning
instead of Turing.

Speaker 4 (12:59):
Yeah, that was the giveaway.

Speaker 1 (13:00):
God damn it.

Speaker 3 (13:02):
Yeah, you should read these cards before.

Speaker 1 (13:04):
You think I didn't read it. Well, then you're extremely dumb.

Speaker 4 (13:07):
I stew over these things.

Speaker 1 (13:09):
By the way, when's the last time you took
a hearing test?

Speaker 3 (13:13):
Do you think I have a problem? No?

Speaker 1 (13:16):
Never? Okay. No, it's like my kid just took a
hearing test. It's not like... you just have to raise your
hand. And I just want you to, during this interview,
just raise your hand once you realize that you're...
that you know that you're talking to a complete idiot.

Speaker 3 (13:38):
No, No, I don't believe in that.

Speaker 1 (13:39):
You don't believe.

Speaker 4 (13:40):
You don't believe though.

Speaker 3 (13:41):
I mean, I'm also an idiot, you know.

Speaker 4 (13:44):
No, not true.

Speaker 1 (13:45):
What are some of the real dangers of AI because
you're not a doom and gloom guy with it?

Speaker 3 (13:50):
No, I don't like all the scary stuff, AI
is going to take over the world and, like, a
Terminator will walk into this room and all of that stuff.
I think, you know, I appreciate that some people seriously
study this, like, as a long-term problem of, like,
how do we make AI behave the way we want it
to behave. But I think what I'm much more concerned
about is, like, today AI is doing pretty bad things already,
and you know, if you care too much about this

(14:11):
kind of Terminator sci-fi AI taking over the world,
I think you're also kind of ignoring the real
dangers today.

Speaker 1 (14:17):
And the real dangers are just believing everything that's put
out?

Speaker 3 (14:21):
Part of it, yeah. Thinking that somehow, because AI is intelligent,
it knows how to make decisions that are good for
all of us. Well, these systems are full of biases,
and you know, AI is constantly also being used to
you know, track people, to even build kind of automated weapons.
AI is being used in many ways today that are
quite dangerous already, and that's what I'm more concerned about.

Speaker 4 (14:44):
Well, yeah, that does sound worrisome. Now I'm back
on the doom and gloom. I thought you'd maybe make me feel
better about that.

Speaker 3 (14:50):
No, but I mean, there's drones flying around and you
can tell them, find Daniel Tosh and shoot at Daniel Tosh.
This is very doable. And... sorry, sorry. We can
cut it.

Speaker 1 (15:00):
If I see a drone on
my property, my instinct is to get a slingshot out.
But now I think that might not be good enough.

Speaker 3 (15:08):
Yeah, you should move closer to an airport, where
you cannot have drones.

Speaker 1 (15:11):
Just move closer to an airport? Yeah... oh, I'd rather be
shot at. How advanced is the cutting edge of this
technology compared to what the public is aware of?

Speaker 3 (15:22):
I think the public is aware of the cutting edge.
Anyone who has something new desperately wants to put out
a press release, show the world make money.

Speaker 1 (15:30):
Yeah, all right, so there's nothing behind the scenes that's
way scarier than what we see?

Speaker 3 (15:34):
I mean, some of these models take a while to train.
Like, I'm sure OpenAI is training the next GPT,
and it's probably impressive and they know it, but they're
not ready to release it yet. But as soon as
they can, I'm sure they will.

Speaker 1 (15:44):
Isn't the greatest part of computer learning not artificial intelligence,
but rather human stupidity? Most people believe what they're told
because nobody reads or bothers to.

Speaker 3 (15:53):
Yeah, it's all, it's all the magical thinking, right? People
think this is magic, that it knows everything, that it
can make perfect decisions for everyone. And I think this
happened before, like in the nineties, when AI started to
beat the world champion at chess. You could be like, okay,
we're done, right? Like, chess, in the West,
for thousands of years, has been the game where you
prove you're intelligent, and if AI can do that, then

(16:14):
we're done. But it turned out that after that there
wasn't really all that much more that AI could do,
even though this was really impressive, and so this happens
over and over again. Right at some point, AI beats
the world champion at a game of Go, which is
much harder than chess. It's kind of the hardest board
game you can play.

Speaker 1 (16:31):
Go is the hardest board game?

Speaker 3 (16:33):
I don't know, I'm not an expert, but yeah, it's,
it's like... I think you can think of
it as, like, an East Asian kind of variant.

Speaker 1 (16:40):
Is it enjoyable? Is it a fun game to play?

Speaker 3 (16:43):
I tried. It's way too complicated for me to enjoy.
So yeah, but some people really love it, and you
know once you once you beat that, you're like, okay,
there's no more harder game where humans are better than Ai.
And then you're like, oh, maybe poker, like we're good
at bluffing and reading people. AI starts to beat the
world champions at poker.

Speaker 4 (17:01):
Too Many Monkeys? Do you ever play that?

Speaker 1 (17:02):
Now?

Speaker 3 (17:02):
What's that?

Speaker 1 (17:03):
It's a card game.

Speaker 4 (17:04):
Okay?

Speaker 1 (17:05):
With your daughter, you'll love it? Uh huh? Can she
count to six yet?

Speaker 4 (17:09):
Yes? Then you're in. That's pretty neat.

Speaker 3 (17:11):
Yeah. Yeah, we'll try. I'll look it up.

Speaker 1 (17:13):
How come AI can't come up with something where they
make you click on photos of bicycles or say, I am
not a robot?

Speaker 3 (17:20):
I don't think these things work anymore, really. Like, AI
is able to crack all of these. They keep changing
them all the time, just to make sure that, you know,
someone who's building the AI for the previous thing has
to spend some time building the AI for the
next thing. That's why they keep changing all the time.
But I wouldn't really trust them.

Speaker 4 (17:34):
So that's not going to save me.

Speaker 3 (17:36):
Yeah.

Speaker 1 (17:36):
Can you walk me through a plausible doomsday scenario where
the machines take over and I'm locked out of my Rivian?

Speaker 3 (17:44):
Plausible? No. I mean, I'm sure your car can
break down. I think your Rivian or your Tesla will
today already misbehave and not open its doors; that has
nothing to do with AI, right? These things are just
not robust.

Speaker 1 (17:55):
Do you have a smart home?

Speaker 3 (17:57):
Not really. I have, like, a Google Assistant, too, because
my toddler keeps requesting music, so I don't really want
to, like, click on my phone. I just say, please
play this music. So that's my only use case for AI.

Speaker 1 (18:07):
Really, are you into all of that tech stuff, or not necessarily?

Speaker 3 (18:11):
I find most of it just makes my life harder.
I mean, there were ads for, like, Siri and these
types of assistants ten years ago that claimed, oh, they
will plan your trips, they will do this, they will
do that. In the end, they never understand what I say,
maybe because I have an accent, and it's just like,
there's a few things I know I can ask; everything
else, I'm just not even trying anymore.

Speaker 1 (18:30):
I always think of, like, these guys... these
billionaires, these tech guys, that create these safe bunkers
or whatever, and there's so much tech in these homes.
I'm like, unless doomsday happens the day they're finished
and they move in right then... if it sits for
ten years...

Speaker 3 (18:49):
Yeah, there's no way things are going to work. They'll
need some tech support in their bunker.

Speaker 1 (18:52):
Because I can't get my lights to turn on half
the time with my Crestron app. So I'm just,
I'm just loving that Mark Zuckerberg thinks his entire
Hawaiian island is going to work.

Speaker 1 (19:02):
I really just want to know that ten years after
I die, my kids still have it good for a
few more years, and then past that, I don't really care.

Speaker 3 (19:13):
Yeah, I think you're fine. Also, I'm like not a
fortune teller, right, so, like, I think your opinion is
just as valid as mine on like the future of AI.

Speaker 4 (19:20):
Honestly? Yeah, true. No way is my opinion as valid as yours.

Speaker 3 (19:25):
I mean, something about what you said earlier, like,
you know, think of, like, music, right? Like, you can
ask an AI today, like, give me another song
that sounds like Kanye, right, like, or whatever. Like, that's
pretty easy, actually, because there's a lot of that style already
there and you just do something in a similar style.
If you ask AI to invent jazz or rap music
or something that's completely different, that's really what it struggles with,

(19:46):
and we don't really have an idea of how to
do that. So AI is really good at, like, more
of the same. It's not so good at, like,
doing something fresh. And so I don't know how you
think about your own comedy, but if you think you're
different from all the other comedians, you're probably fine.

Speaker 4 (19:58):
I'm not. I am not that different. Oh all right,
well whatever.

Speaker 1 (20:03):
Do you have any creepy stories about how big data
knows people better than they know themselves.

Speaker 3 (20:09):
I'm sure it happens all the time where people you
know are outed for you know, being pregnant or you
know whatever, just by you know, someone in their home
with the same IP searching for something.

Speaker 4 (20:20):
Right.

Speaker 3 (20:20):
I think this happens all the time. Yeah, AI figures
things out pretty quickly. I don't know. I always see
the same ads on Instagram, and they're always like something
that I searched on some other websites. So I really
feel like all the companies know exactly.

Speaker 1 (20:33):
Do you have a burner phone? No?

Speaker 3 (20:36):
Do you have one?

Speaker 5 (20:37):
No?

Speaker 1 (20:39):
I'm not somebody that ever... I don't do things that
matter. Like, there's my wife; I'm never trying to keep
anything from anyone. If somebody, if Big Brother looked in
on me, they would just be like, oh, that is
horribly unimpressive.

Speaker 4 (20:54):
Yeah.

Speaker 3 (20:55):
I mean, the problem for me is I grew up
in this generation, like early two thousands, where the Internet was
only good, and by just doing more on the Internet
and making everything open, we would change the world and
everything would be connected and better. And then, like, ten
years later, we realized, oh, that's not actually what's
happening with the Internet. But I feel like all my
information is out there already, so, like, what more do
I save?

Speaker 4 (21:15):
Just porn?

Speaker 1 (21:16):
All the internet did was flood us with porn.

Speaker 3 (21:20):
So yeah, I don't want to get in trouble here,
because one of my colleagues in my department at UCLA
actually invented the Internet, so I cannot really say it's
a negative thing.

Speaker 4 (21:27):
I thought, you know, it was Al Gore.

Speaker 1 (21:30):
Sure. Does it matter whether or not I accept cookies
on a website?

Speaker 3 (21:37):
It doesn't matter. You'll be tracked anyway.

Speaker 1 (21:39):
So I should just always say accept all.

Speaker 3 (21:41):
I always do. Do you?

Speaker 4 (21:43):
Yeah?

Speaker 1 (21:44):
I like to get the other one, where it's like, oh,
just the selected ones that I like. I go through and
I start...

Speaker 3 (21:49):
It just makes your life harder for no good reason,
So just.

Speaker 1 (21:53):
Accept them all and move on. Can you read the
titles of your books?

Speaker 3 (21:56):
Oh?

Speaker 4 (21:56):
No? Please?

Speaker 3 (21:58):
Which ones did you find here?

Speaker 1 (22:00):
An Introduction to Lifted Probabilistic Inference, part of
the Neural Information Processing series. And what's this?

Speaker 4 (22:09):
Other one was the query.

Speaker 3 (22:12):
Processing on Probabilistic Data.

Speaker 4 (22:14):
Yeah, doesn't that sound fun?

Speaker 1 (22:15):
Yeah, oh man, I'd love to recommend those to my
wife's book club.

Speaker 3 (22:22):
Yeah, so you invited me to this podcast based on
reading those books.

Speaker 1 (22:25):
There is no way I could read those books.

Speaker 3 (22:28):
Yeah, the titles sound fancy, but those things are not
even really what current AI is doing.
The current AI is actually really simple. The techniques
that actually worked turned out to be way more boring
and less, like, technically fancy than what we thought would
be necessary to build AI. And that's also something that
puzzles the community, like, how are all the simple

(22:49):
dumb things working really well, and, like, all the clever
things don't really work? It's very confusing.

Speaker 1 (22:55):
If AI is so intelligent, how come it can't figure
out that I'm not trying to type "duck" all the time?
I swear constantly. You'd think it would pick up on
this at some point?

Speaker 3 (23:06):
Yeah, I share your frustration. So here's the thing, right?
So the AI is really impressive. Like, these language models
are really good at actually giving you answers that are
really clever. But then if you actually want to integrate
this technology into everything, you need to, like, engineer a
whole bunch of stuff that somehow people are not able
to do in a way that I'm happy with either.

(23:28):
Part of the problem is, like, who's going to actually
build this stuff? Like, an AI engineer is so expensive
for these companies, and they're all just trying to build
the next ChatGPT to have a nice big press release
about it. But to kind of do the dirty work
of actually integrating all this stuff with all their services
and so on, that's actually quite complicated and expensive, And

(23:48):
you know, I think that's why we're not actually getting
the functionality we want for things we use all the time.

Speaker 1 (23:53):
How do you think we regulate AI? Because we're
expecting these geriatric dipshits in Congress to wrap their
heads around this.

Speaker 3 (24:02):
I mean it's complicated, right, because I think everyone's confused.
I think the problem is not even them, right if
you ask a lawyer who studies AI regulation, even they,
I think don't really know what is going to happen here.
Like AI companies are using everyone's data. They're probably like
watching this video and putting it as training data into
some video AI model, and the problem is that the

(24:24):
regulation is not super clear about what is fair use
of all this data, Like, obviously you have your copyright
for everything you do, but then somehow companies are still
using it, assuming that if they do it at a
big enough scale they can somehow get away with it.
And then for people like me, it seems obvious that
this should not be legal in a commercial setting. But
then some lawyers are like, no, this seems like fair

(24:45):
use because it's kind of like a child learning and
then making fresh content afterwards, and that is allowed. And
I think someone will figure this out, but I
don't think it's clear right now what is legal. And
that's just for the copyright issue. There's also just like
safety stuff, and I think something should really be banned,
like you know, using AI to look at people's resumes

(25:06):
to decide who gets hired or filtered. I think that's
obviously wrong. Those tools can never really do a good
job and be fair.

Speaker 1 (25:14):
Yeah, but interviews are awful. Have you ever had to interview
somebody? It's just, it's, who's the best liar, performer
to my face?

Speaker 3 (25:23):
Sure, sure, but then what's AI gonna do? It's
gonna be like, oh, hire all people named Daniel, or,
you know, like...

Speaker 4 (25:28):
That's a good start. Now I'm bored.

Speaker 3 (25:32):
I feel like you could do a podcast just interviewing
other Daniels, right? So...

Speaker 1 (25:36):
That's not a bad idea.

Speaker 3 (25:40):
Steal that. "Daniel and Daniel."

Speaker 1 (25:45):
There's a horrific AI-generated video of Will Smith eating noodles.
Does AI have a problem with Will Smith? And do
you think it stems from the movie I, Robot?

Speaker 4 (25:56):
No.

Speaker 3 (25:58):
I've seen that video though, and it gets better every year. Right,
there's like the early version which is just chaos, and
recently it starts to look pretty good.

Speaker 1 (26:05):
Yeah, it's figuring it out, it's learning. Did you ever look
at those AI explicit photos of Taylor Swift?

Speaker 4 (26:12):
No?

Speaker 3 (26:14):
Did you never?

Speaker 1 (26:15):
Are you worried about deep fakes all that stuff?

Speaker 3 (26:18):
Especially in politics? I think that's a problem.

Speaker 1 (26:20):
What do you think of politics in our country?

Speaker 4 (26:21):
You enjoyed it?

Speaker 1 (26:24):
Is it maddening?

Speaker 3 (26:26):
Yeah. It was very... like, I moved here in twenty fifteen, right,
and the world changed very quickly after that. I felt
like this was kind of a bait and switch. Like, yeah,
come to America, it's great, and then... I still love
it here. I don't want to get in trouble here, right?
I love it here.

Speaker 1 (26:40):
Do you think America is the greatest country in the world.

Speaker 4 (26:42):
Yeah? Yeah.

Speaker 3 (26:43):
I mean, I'll say one thing positive, which is, so
I'm from Belgium, my wife is from Bosnia. We cannot
really live anywhere where not one of us is a foreigner, right,
except in California. I feel like we're both here and
no one cares that we have an accent, and we
really feel like we're both at home, right? So I
think that's really beautiful about California and the US more generally.

Speaker 1 (27:00):
Let's just say just California.

Speaker 4 (27:03):
Let's be honest.

Speaker 1 (27:03):
If I brought you two into a few different markets,
you'd feel very differently. Is California the only place that
you've lived in the United States?

Speaker 3 (27:12):
Yes? In the United States?

Speaker 1 (27:14):
Yes? Have you visited the whole country?

Speaker 3 (27:16):
I mean, so the thing is in my field, like
we have these conferences where everyone meets to talk about
AI, and it's always in the same kind of
Hilton or Sheraton or whatever hotel in some random city.
So I have seen all the hotels in all the cities.
I wouldn't really say that I've seen the country that much.
I mean, I love to travel around California, and you
know whatever, but.

Speaker 1 (27:35):
California is its own country. You've seen enough. Who is
the most famous Belgian beside yourself?

Speaker 3 (27:42):
Jean-Claude Van... Jean-Claude Van Damme.

Speaker 1 (27:46):
Did you like him growing up?

Speaker 3 (27:48):
Yeah? I mean in the nineties I was a teenager,
so yeah, I loved the action movies.

Speaker 1 (27:54):
Yeah, all right, Who's got better chocolate?

Speaker 4 (27:57):
The Swiss?

Speaker 3 (27:58):
Belgium?

Speaker 1 (27:58):
Of course. I mean, are you sure about that?

Speaker 3 (28:00):
I mean, I cannot... Even if I thought differently, I couldn't tell
you on this podcast. Like there's certain things I'm not
allowed to say.

Speaker 1 (28:10):
Do you miss your waffles or no?

Speaker 3 (28:11):
So waffles are only for tourists, really. Waffles are not
a big deal in Belgium. Yeah. So chocolate is real,
beer is definitely real. Waffles are mostly for you guys.

Speaker 1 (28:21):
Oh, we appreciate it, by the way. When you've
occasionally had a waffle, how do you have it?
Do you have it with syrup?

Speaker 4 (28:27):
With cream?

Speaker 1 (28:27):
Fruit? Never?

Speaker 4 (28:28):
Never?

Speaker 3 (28:29):
I never have waffles, ever. I like, like, a
French crêpe, like a pancake.

Speaker 1 (28:35):
That's my... I don't have a French crêpe machine. I
brought you a waffle machine.

Speaker 3 (28:41):
Oh no... oh no, you were serious about getting
me a waffle maker.

Speaker 1 (28:47):
But it makes little cars and trucks... oh, different things.

Speaker 4 (28:52):
Yeah, my daughter will love this, love the waffle maker.

Speaker 3 (28:56):
Thank you.

Speaker 4 (28:56):
Okay. Yeah, I don't know why I have that.

Speaker 3 (28:59):
I feel mildly offended.

Speaker 1 (29:00):
But you know, is that offensive, to give you a
waffle maker? No?

Speaker 3 (29:03):
No, no, it's okay. I actually would like one. I
don't think we have one, so I'll take it. I'll
take it.

Speaker 1 (29:07):
Yeah, and then the next gift I got you... I
love board games, and I know that, you know, you
played chess and stuff, so this is a... but I
don't like this game and I never got into it.
Everybody buys me games because they're visually pretty. What is
this, a trio? Oh?

Speaker 4 (29:24):
Trio. Oh, Trio.

Speaker 1 (29:26):
Okay, but it's like a pretty game.

Speaker 3 (29:28):
Oh yeah, I could put this.

Speaker 1 (29:29):
Uh, and it's got pretty pieces that go into it.

Speaker 3 (29:32):
But it's like a decorative item. This looks cool.

Speaker 4 (29:34):
Thank you.

Speaker 1 (29:34):
Well, yeah, I'm just like, I'm never gonna play this
game. It's not my game, okay?

Speaker 4 (29:40):
I stick with my Rummikub. Thank you. Kub? What
do you call it? Kube? I forget.

Speaker 3 (29:45):
Don't ask me. I don't speak your language.

Speaker 4 (29:47):
How many languages can you speak?

Speaker 3 (29:49):
I speak Dutch, French, English, and a little bit of German.

Speaker 1 (29:52):
I was born in Germany.

Speaker 3 (29:54):
I actually learned... Yeah, I watched your interview with the Schiff.
You were in Boppard, you said in that one.
I was actually there.

Speaker 1 (30:00):
We were in Boppard.

Speaker 3 (30:00):
Yeah, yeah, so what happened: there was a workshop for
people to discuss AI in this weird hotel castle. It's
owned by the guy who invented gummy bears, Haribo gummy bears.
So we were just there talking about AI, and then,
you know, there were just gummy bears everywhere, and, like,
everyone's getting sick eating gummy bears off the table,

(30:21):
and yeah, so that's why I was in Boppard.
We have a lot of get-togethers in German castles
in the middle of nowhere to discuss AI.

Speaker 1 (30:29):
That is the greatest thing you've ever said.
He's in a German castle eating gummy bears.

Speaker 3 (30:36):
In Boppard, with, like... what are these animals that
get hunted and stuffed and then put on the wall?
Like a rhino, an elephant?

Speaker 1 (30:43):
You ever shot a gun in your life?

Speaker 3 (30:44):
No, I don't really want to.

Speaker 1 (30:46):
I've never shot one either.

Speaker 3 (30:47):
No, no, I had the opportunity. I was in Vietnam
and they're like, here, for one dollar, shoot this machine gun.
I'm like, no, thank you.

Speaker 1 (30:52):
You served in Vietnam?

Speaker 3 (30:54):
Yeah, I did.

Speaker 4 (30:55):
Thanks appreciate that.

Speaker 1 (30:57):
Yeah. Isn't it funny when Hollywood tries to guess the
future in movies and you're like, oh, this is such
a bad attempt? You can't predict it, and it just looks
so bad.

Speaker 3 (31:08):
Yeah.

Speaker 1 (31:09):
Do they ever come calling like, hey, we're trying to
figure something out.

Speaker 3 (31:12):
No, because I'm not like a futurist, right, Like imagining
interesting future worlds is not really my job.

Speaker 1 (31:17):
What's your what's your favorite AI movie?

Speaker 4 (31:20):
Yeah?

Speaker 3 (31:20):
I knew you were going to ask that, and I
don't really have one. What I really liked... I really
enjoyed the first season of Westworld, which is not
a movie, of course, it's a TV series. It messed me up so bad.
I started, like, thinking, am I a robot? Like, it
really did. I watched too many episodes in a row,
but, like, it really got to me.

Speaker 1 (31:36):
But it fell off the rails in the following season. Yeah, yeah,
a little bit. I want closure in season one.
They won't ever do that with a good show: just
one season, just figure it out.

Speaker 3 (31:48):
I mean, this is an American thing. In Europe everything
ends after two seasons. And I'm like, okay, done, let's
not ruin it.

Speaker 1 (31:53):
Not everything! Your Great British Bake Off's been going for
a hundred years. Do you like that show or no?

Speaker 3 (31:59):
I do watch it. I watch all of it.

Speaker 1 (32:01):
Yeah, it's very enjoyable.

Speaker 3 (32:03):
Yeah, I'm waiting for you to give me a handshake.

Speaker 1 (32:06):
Oh, just reach across.

Speaker 3 (32:08):
I haven't earned it yet.

Speaker 4 (32:12):
Your your wife, you're married? Yeah?

Speaker 1 (32:15):
Is she smart too?

Speaker 3 (32:16):
She's very smart. She's she's smarter than me for sure.

Speaker 4 (32:18):
Yeah.

Speaker 1 (32:19):
Okay, is she really smarter than you?

Speaker 4 (32:21):
Is that something?

Speaker 3 (32:21):
It's obvious. We play board games; she wins every single time.
Like, it's not even a competition.

Speaker 1 (32:26):
What if she's a cheater? Do you think there's a possibility?

Speaker 3 (32:28):
No, no, no, she wouldn't. No, no, she's
very principled. Yeah, much more than I am.

Speaker 4 (32:35):
Yeah, both of you should take IQ tests.

Speaker 1 (32:36):
Okay, that's what Eddie and his wife did. Eddie and
I did an IQ test. Do you do you know
what your IQ is? No?

Speaker 3 (32:42):
I never wanted to do this.

Speaker 1 (32:44):
Eddie and I did it.

Speaker 3 (32:45):
Yeah, what was it?

Speaker 1 (32:46):
Well, he was smarter than me, and that's all that
I cared.

Speaker 4 (32:48):
About, and I was upset.

Speaker 3 (32:49):
Were you, were you higher than one hundred?

Speaker 1 (32:52):
What, were we? Yeah, we were in the hundreds. Were you
one hundred and thirty? One thirty-one?

Speaker 3 (32:56):
I think you were one twenty nine and Megan was
one twenty seven.

Speaker 4 (33:00):
Smarter than his wife. That's... that's... I'll take that.

Speaker 3 (33:02):
I mean, I remember in high school, like, everyone was
doing these tests for being, like, gifted children, and
I felt like whenever someone was, like, diagnosed as gifted,
it kind of messed them up in a way. And
I was like, yeah, I don't really need to do that.
Do I like knowing everything? Eh, not really.

Speaker 1 (33:19):
Okay, are you teaching now?

Speaker 4 (33:20):
Are you a professor?

Speaker 1 (33:21):
Undergrad or graduate?

Speaker 4 (33:22):
So both?

Speaker 3 (33:23):
So I teach undergraduates one quarter and the other quarter
graduate students. But most of my job is actually research.

Speaker 1 (33:29):
Am I allowed to audit classes?

Speaker 3 (33:30):
Oh yeah, of course. Yeah, just come over. You
can even do the exam; I'll grade your exam for you.
You don't have to pay tuition.

Speaker 4 (33:36):
No, I still have nightmares about that. I didn't finish college.

Speaker 3 (33:39):
I had those for a long time. Now it's like,
now that I'm a professor, I'm afraid that I'm not
showing up for the exam that I'm organizing myself, like
I oversleep, or, like, it changes. But yeah, those nightmares
are still there.

Speaker 1 (33:50):
Are kids using AI constantly to cheat in your classes?

Speaker 3 (33:53):
Uh, yeah, I think it happens a lot. But because
a lot of stuff I do is just math, it's
kind of harder to cheat. But I think if you
have to kind of write things and kind of do
more creative work, I think it's much easier to cheat.

Speaker 1 (34:04):
Is cheating something that you care about as a professor.

Speaker 3 (34:07):
I mean, in the end, I only care if people
learn something. If they somehow cheat and still learn something,
I guess I'm fine with it. But it's just the
fairness of it, that's, you know... I feel like it's
my job.

Speaker 1 (34:16):
The old school just looking at somebody else's.

Speaker 3 (34:18):
Oh yeah that happens still.

Speaker 1 (34:20):
And still happens.

Speaker 4 (34:21):
Yeah, yeah, yeah, that's good. Yeah yeah yeah.

Speaker 1 (34:23):
I just well, I was like, oh, I don't even
know what I would do anymore.

Speaker 4 (34:27):
I'd be so scared in college. When will you have tenure? How long?
How long?

Speaker 2 (34:31):
Oh?

Speaker 3 (34:31):
I got it four years ago.

Speaker 1 (34:35):
Yeah? Would you ever go to a different school, or
do you think you're gonna stay?

Speaker 3 (34:39):
No, I don't think I would want to move. I
really like it here. Also, living in LA I feel
really lucky. If you're going to be a professor in
your field, usually don't get to choose where you live,
and you typically end up somewhere in the middle of
nowhere or some other country. And I'm just really lucky
that I'm in LA which is somehow the best place
to live.

Speaker 1 (34:56):
During this mass exodus that they always talk about, of
people leaving California?

Speaker 3 (35:01):
Do you actually know anyone who left? I feel like
this is something I only read about on Twitter from
people from Texas. I don't know, I've never actually seen
this happen.

Speaker 1 (35:10):
Well, I looked at just actual numbers, and they like
talk about like, oh like one hundred thousand people moved
to Texas and forty thousand people from Texas moved to California.
But the one thing that I enjoyed was just that
highly educated people are still flooding to the state of California.
So I'll take it.

Speaker 4 (35:29):
Yeah.

Speaker 3 (35:29):
I mean, also at UCLA, I think we're lucky. One
of the reasons I love to work here at UCLA
is because we get people from all over the world
that want to come here to do research and science
and engineering, and so, you know, it's still kind of
the place that everyone wants to go to. And that's just,
you know, if we get good students, that makes my
life so much easier. I'm a little lazy. I don't
really want to think too hard. But then if the

(35:50):
smart students come and work with me, that makes my
life very easy.

Speaker 1 (35:53):
The campus of UCLA has always confused me. It's
just smack dab in the middle of, like, a city.

Speaker 4 (36:03):
It's just so weird.

Speaker 3 (36:04):
It's, like, between the Playboy Mansion and Bel Air and
Beverly Hills, and yeah, it's, like, crazy.

Speaker 1 (36:10):
Do you ever drive around UCLA's campus?

Speaker 4 (36:12):
Oh yeah, yeah, beautiful.

Speaker 1 (36:14):
It's beautiful, right, but it's just so weird. It's like, oh,
I was just on Wilshire five seconds ago, and now
here I am watching kids with backpacks, like have a
college experience that is unlike anything else.

Speaker 3 (36:27):
I'll give you a campus tour and show you, show
you around whenever you're around.

Speaker 4 (36:31):
I appreciate that.

Speaker 1 (36:32):
What's the big difference between living in Belgium and living
in Los Angeles, California?

Speaker 3 (36:40):
I mean, there's the good and the bad. So the
good is that, like, everything's so convenient here. The
bad thing is that in Belgium, everyone's kind of equally
rich and poor, so you don't really feel bad about, you
know, differences in wealth, and here it's kind of
crazy and uncomfortable.

Speaker 1 (36:55):
Well, that's why you just, you stay in your lane.
So don't go to those nice areas. That's why UCLA
is in a bad place. Oh yeah, so close to insane...

Speaker 3 (37:03):
Well, I moved south because I'm like, this is depressing,
everyone's so rich here.

Speaker 1 (37:07):
Yeah, agreed?

Speaker 4 (37:10):
How far south did you go?

Speaker 3 (37:12):
I live right here in Mar Vista.

Speaker 4 (37:14):
Okay, yeah, you ever go to a Dodgers game? Never been?

Speaker 1 (37:18):
Do you understand baseball?

Speaker 3 (37:19):
Not really. But I do, I do really enjoy it, even
if I don't understand the sport. I like the American
entertainment in these stadiums. Like, it's very different from Europe.
In Europe, you're, like, standing in the cold, nothing happens
but the game; here it's like a big, like, Disneyland circus.

Speaker 1 (37:32):
I love it.

Speaker 3 (37:33):
It's just fun.

Speaker 1 (37:34):
I mean, a Dodger game... nothing's better than going
to a Dodger game. It's just so fun, so pretty.
And I'm one of those real, real fans, like LA fans,
where I just go once a year
and I go for like the second inning to the
fifth inning. Then I just leave right in the middle.

Speaker 3 (37:51):
Did they ever ask you to do, like, the pitch,
the... yeah? Aren't you, like, eligible for that as a celebrity,
I feel?

Speaker 1 (38:01):
They have never They've never asked. One time they had
given me free tickets and they revoked them because like
that week I said something horrible.

Speaker 3 (38:11):
Oh you're a little bit too.

Speaker 4 (38:14):
Yah.

Speaker 1 (38:14):
But I would like to throw out a first pitch, man.
I would put some smoke on that. Dodgers, won't you
give me a ring? See if I'll throw out a
first pitch?

Speaker 3 (38:22):
Invite me when it happens.

Speaker 4 (38:24):
Do I have to get like your whole family?

Speaker 3 (38:26):
And no?

Speaker 4 (38:27):
Okay, they don't care. They don't care. Well, Guy, I
appreciate you being here. Again, thank you very much.

Speaker 3 (38:34):
Thank you.

Speaker 4 (38:35):
Okay, take care. You have very soft hands. Do I?

Speaker 1 (38:38):
Yeah, Pasha. All right, I want to thank Professor Guy
Van den Broeck for being on the show. We learned so much.
And I'll be honest with you, I didn't notice any
audio glitches. Good job, Dylan. Speaking of AI, I've been dabbling.

(39:04):
I don't know if you guys watched. It was last
week Game two in Boston, the Miami Heat versus the Celtics.
You know how they on the on the floors now
they're putting ads. Well, I hacked it. And guess what,
had a little fun with the people in Boston. Only

(39:26):
if you, if you're white. It was only up for
a few seconds. That's just, that's just funny right there,
Paul Pierce and his dirty shorts needing a wheelchair.
You know, I only had a small window to hack
the system. I got in one more time; this is
what I put up. Ah, the catch. Oh, it stings deep, Carl.

(39:51):
You see who's here? Ava. Ava is... now, some
people may remember Ava from Tosh.0. She looks
a little different. Okay, she's very old, she's got a
lot of dementia. She gets very startled, very easily, no
matter what you do, you're sneaking up on her. Which
is good when you have little kids, because they just

(40:12):
walk up to her and smash her in the head.
And she's like, what's going on? This is my life?
What has it become? All right, we got some plugs,
Carl. Boyswearpink dot com. Check out our charitable clothing line
for toddlers. I'm almost at a break-even point with
that company. That's an exciting milestone. The Goat. Big news

(40:36):
about The Goat. Now, May ninth is when it premieres.
They're gonna drop three episodes at once. But now they
told me on YouTube only May second, they will drop
the first episode. So a week earlier than the premiere

(40:57):
is one episode, but then a week later May ninth,
on the premiere, there'll be three episodes available. So let's
go ahead, and let's go Carl, let's run that back.
The Goat premieres May ninth, but on May second, you
can watch the first episode huh interesting, only on YouTube.

(41:18):
Then on May ninth, you can watch the first episode
again on Prime, but you can also watch the second
and third episode. You know, the way television was meant
to be viewed. All right, I've got some tour dates
that I'd like you to come see me at. And
what else? My son's bedtime story. Enjoy this little bit

(41:44):
of animation. See you next week.

Speaker 5 (41:46):
Once upon a Hi, dear it Twitter ammals, all you
wanted to do it wide on a twin but a
Twitter put that an edition. It all the boat detached
and says they wanted off to play, but everyone a point,

(42:10):
they are so so happy, and then an d and
but there are too little pain. But the pigments were
like the penguins was sick.

Speaker 6 (42:29):
That was, like, one of the many... you know what, this
is like one of those movies at the end, where
it ends, the credits roll, and then they have, like,
some extra little scenes before the movie finishes.

Speaker 5 (42:41):
They talked to MADMT the other extra.

Speaker 6 (42:44):
scenes in Top Gun: Maverick? Yes, all right, then. Yes,
like Top Gun: Maverick. And...

Speaker 5 (42:51):
The Only Soviet Days Length And then they went to
freepd end. Okay, they always wanted gender Red Dean