
October 30, 2024 38 mins

In this episode, Gandhi talks to the Director of the White House Office of Science and Technology Policy, Arati Prabhakar, about all things AI: How can it change health care? How can it protect children? Will it predict the weather? We also talk about the details of the Diwali celebration Gandhi attended at the White House.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hey, it's Sauce on the Side. I'm Gandhi, and today we're doing something a little bit different. I'm here with my lovely producer Andrew.

Speaker 2 (00:11):
Hello.

Speaker 3 (00:11):
Hello, how are you today?

Speaker 2 (00:12):
Oh God, here we go.

Speaker 4 (00:14):
How many episodes of any podcasts have you done? Hundreds?

Speaker 3 (00:17):
Yeah?

Speaker 4 (00:17):
Yeah, that's your podcast voice.

Speaker 3 (00:19):
Yeah, okay, hello, hello. Welcome to another episode of Sauce on the Side.

Speaker 4 (00:25):
That's like half your Scotti impression?

Speaker 3 (00:27):
Yeah, I feel that that's the most radio presenter voice
I have.

Speaker 4 (00:31):
Oh my god, what radio presenter speaks like that?

Speaker 3 (00:34):
There's a couple. Okay, cool, let's just say I was watching a few videos yesterday, and there are a couple of radio presenter people that definitely do.

Speaker 4 (00:42):
Do we know them?

Speaker 3 (00:43):
Oh yeah, oh yeah.

Speaker 1 (00:45):
So my voice sounds a little bit odd still, and
in the rest of the episode you're going to hear
it sounding pretty bad because I think I got a
damn cold at Danielle's damn haunted house for the second
year in a row. I think maybe the little closet I hang out with, the little closet I hang out in, I should say, maybe there's like something happening in there. I don't know, but two years in a row.

(01:06):
Next year, I got to figure out something different, yeah,
or just not show up.

Speaker 3 (01:09):
I don't know.

Speaker 4 (01:10):
I don't know.

Speaker 3 (01:10):
Be a bush.

Speaker 1 (01:11):
Yes, great. But I also have been just kind of like running, running. And one of the things that I
did was go to Washington, DC over the weekend and
then on Monday go.

Speaker 4 (01:21):
To the White House.

Speaker 3 (01:22):
I'm so jealous.

Speaker 4 (01:23):
I would love for you to come with me next time.

Speaker 3 (01:26):
I think, yes, that is my goal. One of the
things I want to do is go to the White House.
I want to tour up the White House more than anything.

Speaker 4 (01:33):
Did you not do it in like eighth grade?

Speaker 2 (01:34):
No?

Speaker 3 (01:34):
Really? We did not do class trips. Oh, we did, but it was like not fun.

Speaker 4 (01:38):
What kind of hellish school?

Speaker 3 (01:39):
Did you go to... It's a Catholic school. Trips, we did, but it was like stupid ones, like we did a ropes course.

Speaker 4 (01:46):
That sounds awesome.

Speaker 3 (01:48):
The White House sounds a little better.

Speaker 1 (01:49):
Yeah, But you know, when you're in like eighth grade
and you're walking around doing that stuff, it's more like
torture than actually, whoa, this is cool, This stuff is history.

Speaker 4 (01:55):
You don't think about that.

Speaker 3 (01:56):
Yeah, I mean, your pictures from this past trip, it looks amazing, so thank you. I'm very jealous.

Speaker 1 (02:04):
It was it was cool, but I know that you
wanted to ask me some questions. Should we do it before,
should we do it after?

Speaker 3 (02:09):
We should do it after.

Speaker 4 (02:10):
Okay? So I got to talk to a very lovely woman.

Speaker 1 (02:13):
Her name is Arati Prabhakar, and she is the Director of the Office of Science and Technology Policy

Speaker 4 (02:21):
For basically the White House and all of us.

Speaker 3 (02:23):
That's insane. Yeah, that's so cool.

Speaker 1 (02:26):
She does a lot of really cool stuff. She works
with AI. I, you know, have a gazillion questions about that.

Speaker 4 (02:30):
So let's just get to her. And by the way,
I got to.

Speaker 1 (02:32):
Do this interview in the Eisenhower Executive Office Building, which was so cool, and I learned that at one point they were thinking about taking that building and tearing it down.

Speaker 4 (02:40):
It's so cool looking.

Speaker 3 (02:42):
Okay, I have questions, Okay, the interview.

Speaker 1 (02:44):
We're gonna get to the interview and then answer his questions. Here we go. You'll hear my voice is still hoarse, as it has been for a while. And now I am at the White House. The party continues, and I am with the Director of the Office of Science and Technology Policy, Arati Prabhakar, and we're going to talk about a lot of really cool stuff because she

(03:06):
is the director of the Office of Science and Technology Policy,
specifically here at the White House.

Speaker 2 (03:10):
Would we say that I get to be President Biden's Science and Technology Advisor and sit in this office at the White House? Absolutely.

Speaker 1 (03:17):
Okay, So I have maybe a million questions for you.
I know our time is limited, so I really want
to get to well, first of all, what is your background?

Speaker 4 (03:24):
And how did you get here?

Speaker 2 (03:26):
How did I get here? Let's see. So the President called, the White House called, in the summer of twenty twenty two to ask if I would be willing to be considered for this job, to which my answer was a yes, please. And the reason is, I had spent half of my professional life in public service, and that included running DARPA, the Defense Advanced

(03:47):
Research Projects Agency. This is a Defense Department agency that
started the Internet and stealth.

Speaker 4 (03:52):
And sort of started the Internet.

Speaker 2 (03:54):
Literally started the Internet. It literally invented and built the beginnings of the Internet. That was a long time ago. I was not responsible for that.

Speaker 4 (04:01):
I didn't ask how long ago.

Speaker 2 (04:02):
It's an agency that makes revolutionary things happen. And
then I also ran a very different organization, NIST, but I had seen the government's part of research and development, how we make science and technology happen, from a lot of perspectives. And then the other half of my professional life was Silicon Valley, and my home is Palo Alto, and that life was mostly early

(04:26):
stage venture capital investing and, a little bit, a couple of companies, and so I had seen innovation from all of these different perspectives. And that's my great passion: how does this country keep inventing the future and building the future that we all want? And I know we do it, and I know we only

(04:46):
can do it because of public and private working together. And this perch in the White House, helping the president and leading this office at the White House, this is the only place where you really get to do that, where your day is to think about and work on that whole innovation.

Speaker 4 (05:03):
That is wild.

Speaker 1 (05:04):
So when you say, I got a call from the President, did he just call you himself?

Speaker 4 (05:07):
So it wasn't him. But there was a cell phone number?

Speaker 1 (05:11):
Do you have it?

Speaker 2 (05:12):
Well, occasionally you get a phone call from a White House number, and if you've been in Washington, you know to look for that prefix. So yeah, actually, literally what happened is I got an email from the Presidential Personnel Office and they asked if they could have a call with me, and, you know, when you get one of those, you should often

(05:32):
know what it's for. And that's how it started.

Speaker 1 (05:35):
So you talk about the future of technology, the future of all of us, right. And one of the things that I am

Speaker 4 (05:44):
very excited about: my sister and I, my sister's here with me.

Speaker 1 (05:48):
We grew up in a time and in this country
where Indian people, Indian celebrations were not really a thing.

Speaker 4 (05:56):
We celebrated in our household mildly with our parents. You know, my mom was more religious

Speaker 1 (06:01):
Of my dad, she would want to give us gold
and that was kind of the extent of it. And
now we're having the Valey parties at the White House.

Speaker 2 (06:08):
Isn't that amazing?

Speaker 4 (06:09):
It's incredible, it really is. It feels so cool. I appreciate being here.

Speaker 1 (06:15):
I know you, you are obviously Indian, you were born in India and grew up in Texas. You've seen more than we've seen as far as this change goes.

Speaker 4 (06:23):
How is it for you?

Speaker 2 (06:24):
It's really amazing, and it just reminds you what America is all about. My mom brought our family here in the early nineteen sixties, and first of all, there were very few women. The Vice President's mom was one of those pioneers, and it's sort of amazing to think about these women coming in that time period. My mom

(06:45):
brought us here in the early sixties; my family, we came when I was three, and then over the course of a few years we found our way to Lubbock, Texas, which is where I had many of my formative years. I went to high school and loved it, went to college in Lubbock.
And at the time that I went to high school
in Lubbock, I was the only Indian kid in my

(07:05):
high school, you know, graduating class six or seven hundred,
and it was a time, it was the seventies, and
people would ask, without irony or humor or malice, they
would say, what tribe. When you said the Lord, you
were India, right, And it's because they knew Native Americans.
They didn't have met anyone from India. So so you
were talking about Dvali. I remember celebrating Duali at our

(07:26):
house, and you know, we would light candles and drip the wax and put them on our house. And we had friends who would come over, and all of our American friends thought it was really cool. But it was definitely homebrew, because that's all you had, right. And now to fast forward and to be here at the White House and to find that the President's

(07:48):
hosting Diwali and it's going to be a very big gathering tonight, it really speaks to the journey of this community. But also, that's what this country is all about.

Speaker 4 (07:58):
It really is.

Speaker 1 (07:59):
And I am so appreciative to be here. So I'd like to thank everybody for inviting us. We will not embarrass you all too badly.

Speaker 2 (08:05):
Okay, well we're going to hold you to that.

Speaker 1 (08:07):
Okay, "too badly" being the key words, a little bit but not too bad there.

Speaker 2 (08:10):
I'm not going to ask a lot of questions. Now,
what's that?

Speaker 1 (08:13):
So let's talk about all the things that you are doing. So,
first of all, a lot of people don't know what
an executive order is. So when you introduce an executive
order regarding AI, can you explain what that means?

Speaker 2 (08:25):
Yeah?

Speaker 4 (08:25):
Asking for a friend?

Speaker 2 (08:26):
Yeah, absolutely, absolutely. So the way we make laws in this country is with the legislative branch of government, that's Congress, and an executive order is when the president takes action within existing laws. It's not new laws, but this is the president saying, these are the things we can and must do under existing law, and in particular on artificial intelligence,

(08:50):
which surged to the forefront over the last couple of years. There's a lot to say about AI and what's going on, but one of the important things that this president did was to put in place an executive order where he said, we're going to pull every lever under existing law.

Speaker 1 (09:08):
Okay, it's really hard for policy to keep up with technology.

Speaker 4 (09:12):
At the pace technology is moving.

Speaker 1 (09:13):
And with that, we're seeing a lot of things kind of popping up that are worrisome, terrifying, also really fascinating and encouraging.

Speaker 4 (09:22):
Right, all of the above, Right, how do you start
to regulate these things?

Speaker 1 (09:25):
Specifically, let's start with likenesses of people, women, children. How do you move through that? Because that's huge.

Speaker 2 (09:32):
Let me start with the big picture. So when chatbots and image generators burst on the scene, look, everyone understood, I mean, certainly the President and the Vice President understood, AI was already in our lives, right. So when you go online and you pay whatever you get charged for an airline ticket, AI was behind that. And when you go to a hospital, what kind of a diagnosis

(09:54):
or treatment you get. So AI was in our lives, but it was hidden. And then, with chatbots and image generators, all of a sudden AI was in our faces. And it was a big moment. And what happened is that the President and the Vice President seized that moment to say, look, we've got to get on the right track, because this is a really powerful technology, just exactly

(10:17):
to your point. It brings promise and peril both; they come together, and we've got to do two things. We
have to manage its risks and we have to seize
its opportunities. And that was the mission that they gave
all of us. And then we got to work. And
the mechanics of it is working with Congress as they

(10:37):
came up the learning curve and as they start moving towards, we hope, good legislation; working globally; working with companies and calling on them to make voluntary commitments; and the executive order that the President signed, which was, you know, we're going to work with everyone else, but we're going to do our work here in the executive branch. So what happened then? Well, let me give some examples of how

(11:02):
we're starting to create an environment in which AI is being useful but we're managing its harms. So
when you go to a bank today and you ask
for a credit card or mortgage, almost always it's an
AI system that evaluates your application, and today you are

(11:23):
owed an explanation when you get a yes or no, and so you can do something about it. The reason that requirement is there is because of our Consumer Financial Protection Bureau. They put that rule in place so that the consumer, the finance consumer, has more of a say. If you go today to a hospital, the AI system that is being used to diagnose and treat you, to

(11:44):
figure out what the right course of treatment is, has to have proven that it's not embedding all the bias of bad decisions from the past. That's because of a rule of our Department of Health and Human Services.

Speaker 1 (12:05):
Now, when you say it's not using all the bad information from the past, what exactly does that mean?

Speaker 2 (12:11):
Let's unpack that, because this is super important. So there's a long history of medical decisions that were made based on the color of a person's skin, the amount of money in their pocket, the neighborhood that they lived in, whether or not they had insurance. Those aren't medical factors. And when you take those biased
decisions of the past and you just blindly train an

(12:31):
AI model on them, what you are doing is something horrible,
which is you are intensifying and distilling all of that
bad decision making of the past. If you correct for
that and you recognize that some of those decisions in
the past were bad, and you train on good data
that tells you what kind of treatments really lead to

(12:52):
better outcomes for people, then you can actually get the power of the positive benefit of AI. But you can't just throw a bunch of data at it and hope it comes out okay; you really have to do the work.

Speaker 1 (13:04):
So let's break that down for people who might be
a little lost.

Speaker 4 (13:08):
in all the sauce here. We like to talk about sauce here on Sauce on the Side.

Speaker 1 (13:11):
Okay. Yeah. Essentially, what you guys have done is start to reverse the damage of years, decades, generations of this, because we know AI has been around, like you said, for a very long time, mining all of this info and intentionally or unintentionally separating groups and keeping certain groups worse off than others. And what you guys are doing

(13:34):
is stepping in and fixing this.

Speaker 2 (13:35):
Now, one step, little done, one step of the time,
saying let's make sure we're not exacerbating bias.

Speaker 4 (13:41):
Right, how far along in the process do you think you are?

Speaker 2 (13:43):
Right now?

Speaker 4 (13:44):
From like one to one hundred, one hundred is perfect. You started at nothing.

Speaker 2 (13:49):
Would say what has gotten done is a great start,
and there is no question there's a lot more work
to do. First of all, because AI is being used
for more and more purposes and every one of them.
But we've been talking about the bias risk, but there
are also risks of exposing private information. Oh yeah, there
are risks to work.

Speaker 1 (14:08):
You know.

Speaker 2 (14:08):
We can either do this right and use AI to let people do more and earn more, that's the future we need to build, but if we don't pay attention... We already know there are employers who use AI to surveil their workers, to dehumanize their work, to hollow their work out, and often to replace workers sort of blindly without

(14:30):
really thinking about the consequences. So again, I think in
every case, the issue is how do we start being
very purposeful so that we get to the future we want.

Speaker 1 (14:40):
So when you talk about being purposeful and protecting women and children, what are some of the steps that everyday people can take? People that don't know anything about AI, that don't know that these things can happen. What do you do?

Speaker 2 (14:50):
How do you know? Yeah, I think a great place to start is just to be aware of how many places it's embedded in the world around us, and we're doing the work that we need to do as a government to make sure that things that are illegal aren't, you know, happening anyway because of AI. Our regulators are really clear; they're getting better at enforcing as they understand how AI is coming into lots of different places.

(15:13):
But we all need, each of us as individuals need, to be smart consumers. So I'd say that's one: just be smart about what's coming at you, and recognize that, you know, there's an algorithm that is feeding you the next thing in your social media feed, there's an algorithm that's feeding you the next ad but also deciding pricing and a lot of these other things. And then, once you're savvy about what's coming at you, think

(15:35):
about where you can use AI in your life to
do more of what you want to do. And I'll
tell you, if you are a student, there are more and more AI tools that can help you learn, you know, right where you are. They can meet you right where you are and figure out what you need to learn next. If you're a small
business owner, there are more and more tools to help

(15:56):
you with your marketing, with all aspects of your business.
So you know, get out there and start exploring and experimenting.

Speaker 1 (16:05):
So with AI and using likeness, a lot of really sketchy things can come about from it. I assume, if we've thought about it, you guys have already thought about it.

Speaker 2 (16:16):
Huge issue.

Speaker 1 (16:17):
Yeah, what happens in the case of a generated image
of a child in a position that none.

Speaker 4 (16:24):
Of us want to see a child in.

Speaker 1 (16:26):
Exactly, it's not real, nothing about it is real, but the people who are consuming it are still the same level of creepy for consuming this thing that is not real. Where does that fall in all of this, as far as legislation, as far as regulating it?

Speaker 4 (16:39):
How does that work.

Speaker 2 (16:40):
Well, let's start with a year ago, when the President signed the executive order. We were thinking about and trying to figure out what the various kinds of risks were from this new surge in AI, and the question we were asking ourselves was which of those risks would turn into real-world harms first.

(17:02):
And I'll tell you, I now think we have the answer, and it's an ugly answer. And what it is is deepfake nudes, right, sexual abuse. With these amazing image generators, it's just way too easy to take a really simple photograph or sketch of, often it's a woman, often it's

(17:23):
a girl, often it's someone who is gay or queer, and take a person and turn them into a deepfake nude that goes out into the world. Well, that's happening at really an alarming scale today, and it is
you know, it's just unacceptable. And this is an area

(17:44):
where Congress has considered legislation. There are a couple of
proposals that, if they can make progress, I think would
start making real impact on this problem. But while that is happening, we wanted to make sure we didn't wait, and we here at the Office of Science and Technology Policy at the White House teamed up with the Gender Policy Council that works on these issues, and together we

(18:07):
put a call out to the industry to say, what
can you do? Step up now so that we can
deal with this immediate and urgent problem because you can
take actions immediately. So some of that is about the
AI companies that have image generators. Some of that is
about the payment processors. Often they have terms of service

(18:29):
that say we won't provide payment processing for different kinds
of harmful activities. Sometimes they have those terms, but they're
not really enforcing them. And there's much more that the platforms can do. So across different parts of the tech and innovation system, there are actually things that they could do immediately, and to our delight, a few companies did

(18:51):
start stepping forward and making some voluntary commitments that I
hope will start making some progress on this really ugly problem.

Speaker 4 (19:00):
It is a really ugly problem.

Speaker 1 (19:01):
On the flip side of it, if you think about people who have gotten caught red-handed doing things, it's almost like we're going to get to this place of the Shaggy song.

Speaker 4 (19:08):
"It wasn't me." You caught me red-handed. "No, it was a deepfake, it wasn't me." I think, yeah.

Speaker 2 (19:14):
You know, we all worry about bad information or bad
imagery leading people to believe things that aren't true.

Speaker 4 (19:23):
And we all feel that right now. Shoot. Yeah, tell me.

Speaker 2 (19:25):
An even deeper problem is that they cause people to
not believe in anything, and that erosion of trust is
so damaging to our country.

Speaker 4 (19:36):
We're seeing every day right now how damaging it actually is. How do you fight it? Give me some hope here.

Speaker 2 (19:41):
Oh, you know, I think in a lot of ways. I just think we have to be very, very clear about how important freedom of speech is in this country; it is a bedrock principle for our democracy. I keep coming back to this: one thing that has to happen is that each of us has to recognize what our responsibilities are, to be an intelligent

(20:05):
consumer of all the things that are hurled at us.
And after you peel it all back, a lot of
this is about judgment, and I think we have to
make smart judgments about all the things that are coming
at us.

Speaker 4 (20:17):
You really ask a lot of me. I know, you ask a lot from the everyday

Speaker 2 (20:20):
Person, but you know what, the future demands a lot
from us. That's okay, it's worth it.

Speaker 4 (20:25):
I agree. So let's talk about happy things in the future.

Speaker 2 (20:28):
Okay.

Speaker 1 (20:29):
Of AI. We've got to get there, yes we are, I know, because there's just so much terrifying stuff that happens.

Speaker 4 (20:33):
We read a different.

Speaker 1 (20:34):
Story about AI every day where something went wrong, where a robot arm in a factory crushed a person, or, really unfortunately, this young boy in Florida had a relationship

Speaker 4 (20:43):
with a chatbot. It is terrible. Okay, so before we get to the happy stuff, I have a question about that.

Speaker 2 (20:50):
Okay.

Speaker 1 (20:51):
In that story, and if you don't know what happened,
there was a teenager in Florida who was basically in
a relationship with a chatbot. He did know that it
was a chatbot, but this bought learned all about him.
And now his mother is saying that the chapbot actually
encouraged him to take his own life and he did.
What happens in a case like that, is somebody held
responsible or is that just chopped up to Hey, this

(21:15):
is the technology.

Speaker 4 (21:15):
We have to be smart and make good judgments? This was a fourteen-year-old child.

Speaker 2 (21:18):
This is, this is an unbearable thing to think about, and it's so painful to think about. I don't know the answer to your question. And this is a case where the courts are going to make a decision, because the mother is pursuing a lawsuit. But

(21:38):
this is exactly, this is one of the most horrific examples. But a lot of the AI territory that we are in, there are a lot of parts of it that are uncharted. Another area is intellectual property rights: when you create content, as you do and many others do, what is appropriate use? Well, lawsuits right now are going

(21:59):
to sort that out right and so that's another area.
But I want to come back to the fact that
a lot of the things that we are concerned about
with AI today actually are already illegal. It's illegal to
commit fraud, it's illegal to discriminate in housing and lending,
and in healthcare, and those are the places where you
know there's work to do, but we know what we

(22:20):
need to do and the laws are there, by and large,
the framework is there. So it's really a matter of
keeping up with the creativity with which this technology is
being used.

Speaker 4 (22:29):
And where there's a bad guy, there will always be a good guy. At least, that's what I like to think. So now let's talk about hope and the good.

Speaker 2 (22:37):
There's so many good things. So if we can just
get the foundation right, then there's really important potential here.

Speaker 4 (22:43):
I watched a robot perform surgery on a kernel of corn. Amazing.
That stuff is incredible.

Speaker 1 (22:49):
That is the stuff that we need artificial intelligence and robotics and all that for.

Speaker 2 (22:53):
Yeah, there's robotic surgery; prostate cancer is something that is very commonly treated with robotic surgery. I mean, these are amazing things, amazing advances.

Speaker 4 (23:03):
What we need it for.

Speaker 1 (23:04):
We don't need like the profile picture with the crown.
We need surgery.

Speaker 2 (23:08):
Yeah.

Speaker 4 (23:08):
No. And I'll tell you, we ran a whole project called

Speaker 2 (23:12):
AI Aspirations, because look, the reason we're doing all this work to get AI right and to manage its harms and limit where it can go wrong, that's about building a stable foundation. The reason you want that stable foundation is so you can stand on it and reach for the stars. And businesses are all racing to use AI

(23:33):
to get more productivity in business, and that has to be done responsibly. But if it's done responsibly, we want it, right? The world wants it, the country wants it.
We want to see that happen. It's going to be
good for the economy and jobs and growth. But that's
not the only reason we want AI. We want it
because today there are thousands of diseases for which we

(23:54):
have absolutely no medical solution, and yet we only come
up with about twenty or thirty new drugs for diseases
every year. What would it take to go ten times or one hundred times faster? AI could change how we design and approve new drugs in a really radical way. In research, we are seeing that AI can speed up

(24:16):
weather forecasts by ten to fifty thousand times compared to the computational models we have to run today. That's wild. So think about what that might mean for better weather forecasts at a time when our climate is changing. That
could be a huge deal. Think about the educational gaps
that we still have that have persisted across all these
years for our kids, and think about what AI could

(24:39):
do to help us close those gaps. These are the
things that this is the country's work that I want
to make sure we get done with AI.

Speaker 1 (24:56):
So we've got the scary stuff, but we've also got potentially changing our future, saving the population, getting more education to everybody who might not have access to it, and equity in healthcare.

Speaker 4 (25:07):
Yeah, that all seems pretty great. What's most important to you?

Speaker 2 (25:10):
What's most important to me is that we get AI on a path where we humans are driving and not feeling like we are reacting to the technology, because when we drive, then we're going to express our values by tamping down the risks and the harms and then building the future we want to run to.

Speaker 4 (25:29):
So there's no part of you that's worried about robots
taking over.

Speaker 3 (25:33):
Uh.

Speaker 2 (25:33):
I worry about humans and corporations letting things spin

Speaker 4 (25:38):
Out of control, doing nefarious things.

Speaker 2 (25:39):
Well, we talk about the technology, but it's the human beings that are the most interesting part. We're the ones that are going to get it right, and if we don't get it right, it's on us.

Speaker 1 (25:47):
It's just wild to see these things, the way that they learn you, and so quickly. I have a friend who says he has some of the best conversations he's ever had with ChatGPT. He says, you know, I just teach it a little bit about some of my friends, and when I can't reach out to that particular friend, I go to my ChatGPT and say, talk to me like fill-in-the-blank, and it does. And he says when he's feeling lonely, it actually works

(26:07):
out well, and I am blown away?

Speaker 2 (26:09):
Does that blow you away?

Speaker 4 (26:10):
It blows me away. It will blow you away.

Speaker 2 (26:12):
And the thing I keep thinking about is, you know,
on the one hand, I wonder if it's going to
hollow out what being human means. And then I think
about all of human history. You know, it's been a long time since the printing press, right? Yeah, but that completely changed how we thought, how we communicated,

(26:34):
how ideas swept through the world. And I'm pretty sure we worried that we were losing our humanity back then, and somehow we didn't. It changed, we changed, but ultimately we got to a place that we wouldn't want to go back from. So I'm very hopeful we're going to get there with AI.

Speaker 1 (26:53):
Okay, that's a great way to think about it, because
I look at it and I get terrified. I think
we all know that loneliness is an epidemic. We talk about
it all the time, how unhealthy it is, and the
more access you have to more things in social media,
the lonelier people start to feel. Now enter ChatGPT or these chatbots, wherever they are, and you start to feel, okay,
maybe I'm not so lonely, which is great.

Speaker 4 (27:13):
It's also terrifying because it's not real.

Speaker 2 (27:15):
Well, yeah, so how does your friend feel about that?
Does he feel like it's abating his loneliness, or is he okay?

Speaker 1 (27:22):
So he feels, in times where he is borderline lonely, it's nice, because, let's say, he's got some insomnia issues and he can't sleep at that time.

Speaker 4 (27:30):
He'll talk to these things and it's nice. It keeps him company. But he also knows this isn't a real person. He said it's getting really tough, though, because

Speaker 1 (27:36):
It continues to learn and it gets more and more
real every time he has the conversation because he's also
teaching it.

Speaker 4 (27:43):
No, she wouldn't say this. I would never say something
like that.

Speaker 1 (27:46):
So then he eliminates it, and they really start to take on the personality of someone; you can build a best friend.

Speaker 4 (27:50):
And that's wild to me.

Speaker 2 (27:52):
I don't know, it doesn't feel, it doesn't feel like a substitute for deep human connection.

Speaker 4 (27:58):
It'll never hug you.

Speaker 2 (28:00):
It's never going to hug you.

Speaker 1 (28:01):
Yeah, it's never really going to be there when you need it. It shows up when you ask it to. It's not like, you know, my sister, who just shows up without asking and

Speaker 2 (28:09):
being there. Let's just start with that.

Speaker 4 (28:10):
Yeah, right, it's not.

Speaker 1 (28:11):
But there are a lot of benefits and we like it.
And you guys are in charge of making sure that
we do this responsibly.

Speaker 4 (28:16):
We're, we're doing our

Speaker 2 (28:18):
Utmost to get on the right path. But this is
going to take all of us.

Speaker 4 (28:22):
It really will.

Speaker 2 (28:22):
Yeah.

Speaker 1 (28:23):
So if people want to reach out to you guys, they want to contact you, they have issues,

Speaker 2 (28:27):
How do they do that?

Speaker 4 (28:28):
Go to AI dot gov. Is that real? That's the address? That's very simple.

Speaker 2 (28:33):
It's a pretty simple one. But it will show you what we're up to. And it's a place to come in and tell us what you think.

Speaker 4 (28:40):
And you guys just field all the questions?

Speaker 2 (28:41):
We go through everything, everyone; everything that comes in, we see.

Speaker 4 (28:44):
Is AI fielding it? You know, that's a complex question, and the answer is no. Okay.

Speaker 2 (28:50):
And the reason is that we take the confidentiality of information within the government really seriously. That's something that has to get sorted before we can use it in all the ways that we want

Speaker 4 (29:01):
to. Okay. Yeah, why not?

Speaker 1 (29:03):
We all have a party to get to and I'm
very excited about this. Are there any parting words you
want to leave with our listeners about AI, the future
of AI, you and your role in it.

Speaker 4 (29:13):
Anything.

Speaker 2 (29:13):
Just grateful, like you can't believe, to get to do this work with the President and the Vice President, just visionary leaders who really understand what this country is about. They see it. You know how President Biden always likes to talk about how America is the one country that can be defined in a single word, and that word is possibility. Well,

(29:36):
that's what science and technology is all about. It's, you know, a big job for all of us, but I think it's actually one of the most satisfying things we can do, is build the future we want.

Speaker 4 (29:45):
We love science.

Speaker 1 (29:47):
Never give up on science, no matter what anyone says. We love it. Thank you so much for spending some
time with me.

Speaker 4 (29:51):
I really appreciate.

Speaker 1 (29:52):
It for everyone, So thank you.

Speaker 2 (29:56):
I appreciate it.

Speaker 1 (30:07):
All right, there she was, Arati Prabhakar, so cool. It was super cool. You know what was really cool to me is I got to do it all with my sister.

Speaker 4 (30:14):
Yeah.

Speaker 1 (30:15):
I don't think she's ever sat in with me for
an interview or on an interview ever. Like, my family's
just so unimpressed by anything that I do that they don't.
I mean it's not even in a bad way. It's
just kind of like, oh yeah, cool, okay, tell me
how it goes. But my sister is very shy, so
this event for her was like hell on earth. It was so funny because, one, P, I love you

(30:37):
and you already know this. She takes terrible pictures. She
has no idea how to take pictures. She doesn't know
about light. So, like, all the pictures that she took, there's like a beam of light across my face, or the coloring is all off, you know, you know how it goes. Yeah, not everyone can be you and me, Andrew.

Speaker 3 (30:50):
I try.

Speaker 4 (30:51):
You do good pics. I do good pics.

Speaker 1 (30:53):
When someone hands me a camera, I'm like, oh, I got this, don't worry. I need you to turn the other way.

Speaker 4 (30:58):
I take all the different angles so.

Speaker 3 (31:00):
Many, I'm going to hand it back to you, and
you're gonna have at least two hundred new pictures.

Speaker 4 (31:03):
in it. Yeah, my sister will take like five in the exact same spot. I'm like, what's the point of that? I don't understand. But anyway, the pictures I got, that's the best I got.

Speaker 2 (31:10):
Sorry.

Speaker 1 (31:11):
Oh, thanks to my sister. I love her, but it
was really cool being able to spend the time with
her there. She just like hates crowds and attention and
hair and makeup and anything like that. So thanks for coming, P. Andrew, you might be on deck next time.

Speaker 3 (31:23):
Oh my god, that would be my dream. I want
to go to the White House so bad. And the
fact that you got to do this one at the
actual White House itself is so freaking cool.

Speaker 4 (31:31):
Yeah, because I guess the last one was at the
VP's house.

Speaker 3 (31:34):
Yeah, it was.

Speaker 4 (31:35):
At Kamala Harris's residence.

Speaker 1 (31:37):
Yeah.

Speaker 4 (31:37):
I love that this one was at his house. I was thinking the whole time

Speaker 1 (31:40):
How weird it would be to have all of these
strangers just in your house?

Speaker 3 (31:44):
Yeah what, Yeah, get out of here, like at least
what four or five hundred people.

Speaker 4 (31:50):
I don't think it was that many. I want to
say maybe it was like two to three.

Speaker 3 (31:54):
One hundred. Yeah, that's still too much.

Speaker 4 (31:56):
That's too many. If there's one person in my house, I'm like, it's time. Wrap it up, wrap it up, dog, let's go.

Speaker 3 (32:01):
So what was it like?

Speaker 4 (32:02):
Which part? What do you mean, what's the White House

Speaker 2 (32:04):
Like?

Speaker 3 (32:04):
What is the White House like?

Speaker 4 (32:06):
It's white? Okay, it is a house.

Speaker 1 (32:08):
Okay, there are, you know, the stanchions outside, it's, it's cool. We kind of approached it from the side because we came from the Eisenhower Executive Office Building.

Speaker 4 (32:17):
And then we walked in from the side, you see the West Wing.

Speaker 1 (32:19):
You do a little security, and then we went in, and it was decorated for Diwali with lots of hanging flower garlands, which was cool. There was a group of musicians performing ragas, which was really cool. They had, you know, some champagne and wine and a few passed apps.

Speaker 3 (32:36):
I'm guessing it wasn't a cash bar.

Speaker 4 (32:38):
It was not a cash bar.

Speaker 1 (32:39):
I also didn't even approach the bar at any point,
but they were walking around, you know, like handing people
wine and you know, it's always weird for me. You
know how when you go to a big event, there
are always little tables where you can stand and have
food or whatever. I like to stand at those tables
as a prop to kind of keep a distance from
other people whatever. I always find it strange when everyone
just walks by and puts their empty glass on the table.

Speaker 4 (33:01):
I think you're supposed to do that.

Speaker 3 (33:03):
Yeah, I don't know what the etiquette is.

Speaker 4 (33:05):
It felt weird. I was like, this is my table.

Speaker 3 (33:07):
I'm at this table.

Speaker 4 (33:08):
What are you doing?

Speaker 3 (33:09):
Actually, yeah, you can't, because I got here earlier.

Speaker 1 (33:15):
And they were there, like, clinking, and I was really nervous that one of them was gonna fall over.

Speaker 4 (33:18):
It doesn't matter. But it was cool.

Speaker 1 (33:20):
There was a room where he was speaking, so everybody
went into that room and then we waited for him
to come talk. Oh I'm so sorry, President Biden.

Speaker 3 (33:30):
Casual. Who do you mean?

Speaker 4 (33:33):
I figured you would know I was talking about him
because it was his house.

Speaker 3 (33:35):
I know, but it's the way you were saying it, you know. We just went into the next room and he just was, like, getting ready to speak, and it's like...

Speaker 4 (33:43):
Yes, President Joe Biden. And his speech was great.

Speaker 1 (33:46):
It was about light over darkness and about Diwali and the integration of Indians into American society and how
you know, how welcoming he and people want to be
toward us. And it was wonderful. And he said one
thing that I thought was interesting. I keep saying, how
weird it would be to have all these people in
your house, and he said, it's not my house, this
is our house. All of us are here together on

(34:09):
the same mission. I thought that was nice because you know,
other people probably wouldn't say that.

Speaker 4 (34:14):
Then we got to.

Speaker 1 (34:15):
Hear from one of the astronauts stuck up on the International Space Station, Sunita.

Speaker 4 (34:21):
She has you know, been up there forever.

Speaker 1 (34:23):
She is Indian as well, so she did a recorded message, and in the entire message, you know, there's no gravity in her little shuttle area, so her hair was all over the place, and that was really cool. And then the Surgeon General, Dr. Vivek Murthy, he was there also. God,
it's so cool. Yeah, it was just, it was fun. It was, I don't know if fun's the word. Let

(34:44):
me take that back, because it's not like we were partying and, you know, like dancing and painting the town red.
It was just really fascinating to be there, and it
was such an honor to be invited, so happy to
have done it. Not sure if I'll get an invite again,
but I will say this. You know, I said I was going to spend all this time stalking celebrities.

Speaker 4 (34:59):
I didn't even. Not at all.

Speaker 1 (35:01):
There were so many people, and after a certain point,
you know me, I'm like, em, I.

Speaker 4 (35:06):
Gotta get out of here.

Speaker 3 (35:07):
Yeah, I think I'm good.

Speaker 2 (35:07):
I gotta go.

Speaker 4 (35:08):
Yeah. So I texted our girl.

Speaker 1 (35:11):
I don't know if I should say her name or not,
so I won't say her name, but we love you.

Speaker 4 (35:13):
You know who you are. And I said, are there
other things happening?

Speaker 1 (35:17):
And she said, no, you stay as long as you want to, enjoy yourself. And I looked at my sister and I was like, we gotta go.

Speaker 2 (35:22):
It's time.

Speaker 4 (35:23):
It's that time. And then we left.

Speaker 3 (35:24):
But it was great. You went there on official business, you had an interview prior.

Speaker 1 (35:28):
Oh, oh, hold on, do I still have it? Keep talking, Andrew, talk about that. I'll think of a question at some point.

Speaker 3 (35:33):
Okay, well, when it comes to the interview, I think that she was great, and I learned so much about AI, and also it's just, like, scary how there's this, like, brand new technology that up until this point in history we really didn't have.

Speaker 4 (35:50):
We didn't and how do we regulate it.

Speaker 1 (35:52):
I'm still interested to see how that's gonna go, and
I think it's gonna be a long journey.

Speaker 4 (35:55):
I think there are very capable people like Arati Prabhakar who are

Speaker 1 (35:59):
In that place to make sure that things are happening,
and our legislative branch I would like to trust them
as well.

Speaker 3 (36:04):
I would as well. What do you think was your
favorite part of the interview?

Speaker 1 (36:11):
Being considered to go and interview Arati in the Eisenhower Executive Office Building.

Speaker 4 (36:18):
It was really cool. The room was really cool everything.

Speaker 1 (36:21):
I was just like, what? Who would have thought this little Sauce on the Side podcast was gonna take on a life of its own?

Speaker 4 (36:26):
But here we are baby in the White House. That
was really cool.

Speaker 1 (36:30):
And speaking of, you specifically asked me to get something for you.

Speaker 4 (36:38):
Oh, my White House napkin, good God.

Speaker 2 (36:40):
Please.

Speaker 1 (36:40):
I think I can only give you one because so many other people asked for one. Take, take one that isn't as crunched.

Speaker 4 (36:46):
This is so exciting. Here you go, baby. Thank you so much. You're so welcome. One step closer to the White House.

Speaker 3 (36:53):
Yeah, I met Joe. Now I just need to go
to his house.

Speaker 4 (36:55):
Yeah.

Speaker 3 (36:56):
Yeah, Well this was amazing. Thank you so much.

Speaker 4 (36:58):
You're so welcome.

Speaker 1 (37:00):
And thank you by the way, because all this was
set up through you, and because you've been reaching out
to people and creating contacts and.

Speaker 4 (37:05):
I appreciate it. Oh, this is why I feel like
you should come with me if I ever go back.

Speaker 3 (37:09):
Well, listen, I know they're probably listening right now. Please
hook me up with a tour, that's all. I'm looking
forward to the tour.

Speaker 4 (37:15):
Andrew, I'm pretty sure you can go online and book that. I don't think so, you can't book a tour of the White House.

Speaker 3 (37:20):
I don't think so, maybe you can, I don't know. I don't know. I haven't looked into it that much, to the point where I'm saying this is an impossibility, but I feel like it needs to be appointment only.

Speaker 1 (37:32):
It was really cool, though, and we're super honored to have done it, so thank you. And yeah, I don't think we should do an Ask Me Anything or anything wrong or creepy or weird on this episode, because this is a good one, exactly.

Speaker 4 (37:42):
Feeling good about it.

Speaker 3 (37:43):
Yeah, let's leave it where it is. Sauce on the Side got to go to the freaking White House. That was amazing, and you got to interview such an incredible guest. She was wonderful.

Speaker 1 (37:50):
All right, Andrew, if people want to find you online, where? At Andrew Pug? You still over ten thousand?

Speaker 3 (37:55):
Yeah, ten point two. Maybe.

Speaker 1 (37:57):
I don't think my Instagram has grown in a year or two, ever since I got that shadow ban.

Speaker 4 (38:01):
It's just kind of paused.

Speaker 3 (38:02):
Well, I mean, how many likes did you get on your last picture?

Speaker 4 (38:04):
I don't know. I don't look at that stuff. I look at comments every now and then just to see the dumb ones, but I don't know. All right, you can

Speaker 1 (38:10):
Find me at Baby Hot Sauce and we will talk
to you next time.

Speaker 4 (38:14):
Say bye, Andrew.

Speaker 3 (38:15):
Bye,
