Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to another episode of the Kronos Fusion Energy Podcast.
(00:15):
I'm Priyanka Ford, the founder of Kronos Fusion Energy, and today we have the privilege of
hosting Sushma Bhatia, our esteemed board advisor for environment and legislative affairs.
Sushma's career spans over 20 years with diverse experience across public service, consulting,
(00:39):
startups and leading technology firms.
In addition to Kronos Fusion Energy, Sushma also serves as the director of GTM sales tools
at Google, where she leads a team of product managers and engineers to develop state-of-the-art
(01:00):
sales and partnership tools.
Prior to this role, she was the global head of strategy and operations for Payments and
Next Billion Users, shaping Google's financial services partnership solutions program.
Additionally, Sushma is a board member for the state of California, appointed by Governor
(01:24):
Gavin Newsom to the California Board of Environmental Safety.
In this role, she provides strategic guidance and drives public engagement on environmental
safety topics.
Sushma's background also includes a successful tenure at Accenture, where she served as a
(01:48):
business strategy executive and chief of staff for the office of the CTO, the Chief Technology
Officer.
Beyond her professional achievements, Sushma holds an MBA from UC Berkeley and a master's
degree in chemical engineering from the University of Southern California.
(02:11):
Today we'll explore Sushma's extensive experience and her insights into technology, sustainability
and fusion energy.
We'll also discuss her contributions to environmental safety and her vision for a cleaner, more
sustainable future.
I met Sushma on a train in San Francisco over a decade ago, and we became quick friends,
(02:38):
sharing common goals.
Get ready for an insightful and engaging conversation with Sushma Bhatia.
Here's my friend, Sushma.
So I'll tell you that from an angle, from the angle of environmental protection alone,
(02:59):
there's a lot we can do with AI.
For example, EIRs or environmental impact reports can take an average of five years.
Imagine if we could use AI/ML, that is, artificial intelligence and machine learning, to shorten
this time frame.
We could make quicker decisions related to development.
(03:24):
Some other use cases for AI/ML could be to detect illegal landfills, monitor forest health,
monitor biodiversity and so on.
Of course, all of this assumes that we will have solved the challenge of how energy
intensive AI/ML can be.
So Sushma, you're an environmentalist and you work with AI and machine learning.
(03:51):
What is the connection between those two and how does that help society going forward in
the near term as well as long term?
I think there are a few different threads here to explore.
One is today we have a huge environmental crisis.
I think this is undeniable.
(04:11):
Most people will agree that there are climate impacts that are touching everyone on the
globe in different ways.
So we're at a moment in time where every solution available has to be fully explored, and not
just explored in the traditional sense of making incremental progress, but making step changes
(04:33):
in how we think about environmental health protection.
Those are two concepts that are very intertwined.
At the same time, we have a new revolutionary moment that AI presents.
It is so promising in terms of the way it can impact all of us in a positive
(04:53):
way, driving productivity and so on.
We also know that AI is very energy intensive.
So how might we use AI in ways that can drive greater environmental and health protection?
How might we find cleaner energy, or cleaner ways to power AI, to then ultimately build
(05:17):
solutions for us?
That would be a very interesting use case to solve.
I was reading a study earlier this week comparing the energy usage of one Google
search versus one OpenAI prompt.
And it said the AI prompt uses up 10 times more energy than a plain Google
(05:44):
search.
Then it depends on what you're prompting, I guess, like the complexity of your prompt.
But it would be-
That and then the models.
Exactly.
Yeah.
But it's really something.
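A minimal back-of-envelope sketch of the comparison above, in Python. The per-query watt-hour figures and the query volume are illustrative assumptions, not numbers from the episode or from the study mentioned.

```python
# Rough per-query energy comparison: traditional search vs. generative-AI prompt.
# All figures are assumed placeholders for illustration only.
SEARCH_WH = 0.3                 # assumed watt-hours per traditional web search
PROMPT_WH = 3.0                 # assumed watt-hours per LLM prompt (roughly 10x the search)
DAILY_QUERIES = 1_000_000_000   # assumed daily query volume

def daily_energy_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours for a given per-query cost."""
    return wh_per_query * queries / 1_000_000

search_mwh = daily_energy_mwh(SEARCH_WH, DAILY_QUERIES)
prompt_mwh = daily_energy_mwh(PROMPT_WH, DAILY_QUERIES)
print(f"Search: {search_mwh:,.0f} MWh/day; AI prompt: {prompt_mwh:,.0f} MWh/day "
      f"({prompt_mwh / search_mwh:.0f}x)")
```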
And also then we talk about cryptocurrencies and the computing load of that.
Are you into that?
Are you into cryptocurrencies, by the way?
(06:04):
I am not, but my husband is.
He dabbles in them.
And I'm just acutely aware of all of the energy, how energy intensive even cryptocurrency is.
I previously worked in a municipality, I think you're aware.
(06:24):
And through my work in local government, I became aware of the health impacts from energy
choices that we make.
I was in roles where I would figure out what's coming out of smokestacks and what are the
chronic and acute health impacts of things coming out of smokestacks.
(06:45):
So this ultimately affects people.
Sometimes when we think about energy choices, they can seem somewhat removed.
But what brings this home for people is the connection that those energy choices are ultimately
creating impacts in our bodies.
We read about chemicals showing up in breast milk, for example.
(07:08):
Those are largely coming from fossil fuels or chemicals that are derived ultimately from
fossil fuels.
So making that connection for myself was how I became more interested in energy choices
and concepts of energy equity and creating policies or programs that allow consumers
(07:32):
to have choices about where the energy comes from as a way to protect their own health.
A lot of the energy production waste has gotten into food systems and water systems in the
past as well.
How do we measure these things and how do we mitigate this in the future?
(07:53):
How do we right the wrongs of our ancestors?
There is no single answer or solution, but I see this as a multitude of solutions.
But they all start with greater awareness and a bias for action across businesses, communities,
and regulators.
(08:15):
Let's take, for example, waste products that we throw away every day.
Now we know there is no such thing as "away."
And there are some specialized hazardous waste landfills for materials that continue to be
hazardous even after they're used or spent.
We have come a long way in deepening our regulations as well as our enforcement activity to address
(08:41):
these concerns that communities have, these communities that live by these landfills.
They're called fence line communities.
So similarly, let's look at biomonitoring.
So biomonitoring is a program where we are able to look at the information of chemicals
that accumulate in our bodies.
(09:03):
So in our bloodstreams, in our breast milk, and so on.
We are at a very early stage of using this data to inform policies and investments in
programs such as the Safer Consumer Products Program.
All of this is progress from the historic days, as you call it, when regulations were
(09:26):
not as strict as they are today.
The standards today are much more health protective and they continue to get better.
So to bring us back, I think there is a greater connection now that we know that what we consume
(09:46):
impacts our bodies.
The choices we make, including our energy choices, they impact our health.
And by consume, I don't just mean the food we eat.
You asked the question about how we solve for this.
So ultimately, I'd like to say, vote with your dollar.
And there have been campaigns in the past that talk about, hey, putting the burden on the
(10:09):
consumers to make better choices, be more responsible, read the labels, become scientists,
and so on.
But I do think that there are more game changing, broader policy and programmatic decisions,
new technology solutions that can make step changes more quickly versus relying on 8 billion
(10:31):
people to make different choices.
So how might we accelerate the transition to clean energy portfolios?
I think there have been some great examples of how, with new policy choices
and incentives, we've moved the needle in terms of electric vehicle adoption.
How can we apply that model to drive the adoption of innovation?
(10:54):
Yeah.
So I think the more we automate some of these measurement processes, at least the more visibility
we'll have into the situation.
Would you say things are getting better over the last decade or worse?
I've heard someone use a phrase called apocalyptic optimist.
(11:19):
And I think I want to adopt that.
I think I'm both, and it depends on the moment in time which part of the equation dominates.
And I think sometimes the apocalyptic view dominates and other times the optimist view
dominates.
I think today at this moment in time, I'm more of the optimist.
(11:40):
I'm an optimist because, yes, these problems have existed, but there's also greater awareness,
and there are more solutions that are available to us and being developed, and we
just need to get behind them and move them forward as quickly as possible.
The fact that you and I are having this conversation, that you have founded a nuclear fusion company,
(12:03):
and that I'm all behind it.
The fact that I, 15 or 20 years ago, was one of the people that was vehemently
anti-nuclear, because I had only seen the nuclear fission side of the world and was looking
at the portfolio of a municipality through that lens of how do we create a clean energy portfolio.
(12:25):
And I was really struggling to understand why nuclear energy was logged as a cleaner energy
aspect of the portfolio.
So I've made that switch in the last decade, and I now recognize that this is a real possibility.
That is the optimist part of me.
While that's happening, there's also things around safer consumer products.
(12:47):
There's a whole movement around creating brands that bring cleaner, safer ingredients and
there's greater adoption of those.
There's some third party labels that allow consumers to make choices that are safer for
them.
So all of this is the optimist side of me.
I recognize there's a lot to do.
In general, there are smart people working on the solutions.
(13:11):
Yeah, I'm definitely hopeful as well.
I feel there's one way to look at fusion in that it's going to help the larger things
that you want.
So if you look at Maslow's pyramid of needs or something, there's enabling AI and cryptocurrencies
(13:35):
and industrial heating and all of that.
And then there are the almost non-monetizable parts
of it, where you would have cleaner oceans, just less waste going into things, just the
lack of radiation.
I feel if there is this much damage we can do in 150 years of industrial revolution,
(14:01):
there's a lot that can be undone in a hundred years of whatever the opposite of that looks
like.
And I don't know what the opposite of that looks like, but I would like for fusion to
be a part of that.
I imagine AI is a huge part of it.
And as you said, just knowledge of what's even going into our bodies is important, just
(14:27):
as important as enabling cryptocurrency.
It's also a very, very important thing, but there isn't a monetary profit attached to
it.
So I definitely am curious to know about your switch, your anti-nuclear to pro-nuclear,
(14:49):
or pro-fusion anyway, shift.
What was that like?
What was the thought process there?
It's two things.
One is completely just looking at the science, the science aspect of fission versus fusion
and educating myself about how they're different.
And the second thing is this moment in time.
(15:13):
I keep talking about this moment in time because, as we've spoken about, we are so close that,
depending on who you speak to, we have either reached a tipping point, or we've crossed it, or we
are just about to reach it.
So that forces the conversation of what we can do now to create that step change.
(15:35):
And fusion, first of all, from the science aspect, is clearly different
from fission.
It offers the only option for limitless clean energy, as we spoke about.
Yes, I think solar is a part of the mix and hydro is a part of the mix and all of these
other clean energy sources are part of the mix.
(15:57):
But the potential of fusion is so massive that if we can unlock it, it can create that
step change very quickly for us.
So I think to answer your question, there are two things.
There's the science aspect, which is really digging in and understanding that fusion is
very different from fission, and that my points of view in the past were tied to fission
(16:20):
and also to the aspect of radioactive waste.
As you know, I have this bent toward asking what the health impacts are from the choices we make,
the choices we make as a society, not as individuals.
And recognizing that fusion actually doesn't create the long-lasting radioactive waste
that nuclear fission does.
(16:41):
So the science aspect and then the second thing is the moment in time aspect.
This technology has seen many breakthroughs recently, as recently as the last two
weeks.
So we might just need to, as a society, get that energy behind it and drive it to commercialization.
Yeah, I agree with that.
(17:02):
Fission always scared me.
The reason it scared me is because the advent of it brought about the invention of so many
weaponizable things.
And especially like watching, did you watch Oppenheimer, by the way?
I did.
He did.
Absolutely.
He wanted to do fusion.
The dude was all in on fusion.
(17:22):
He wanted to do fusion and then he got distracted, I suppose, to say the least.
Yes.
And then there's the concept I talked about earlier, that there's
no such thing as "away."
So when you talk about radioactive waste, where are you going to store that?
There is no away.
It ends up being stored in the soil somewhere, and there are communities living nearby.
(17:46):
There's a risk of meltdowns or natural disasters exposing them.
That risk continues to remain for many, many, many hundreds of years.
And that is my concern.
It's like you're solving one problem with another.
Yeah, I mean, tens of thousands of years, Sushma.
I mean, do we even know if there'll be human beings around that will remember to take care
(18:10):
of this?
You know, like, how do things translate?
It scares me too.
It scares me too.
We could undo millions of years of global evolution just by a couple of accidents here
and there with fission.
And that is not sustainable, no matter how you look at it.
(18:36):
As you know, I work as part of a government board.
It's called the Board of Environmental Safety of California.
And in this role, I often think about how might we reduce the generation of hazardous
waste in California and how might we manage it safely within our state.
(18:56):
And I've also become aware of how much has changed, not only in terms of our awareness,
but also in terms of our regulatory response.
In the past, it may have been legal to dig up the land and store unwanted lead paint
cans there.
It was also legal to develop that land with parks and communities and such.
(19:20):
This was assumed to be safe when we were not aware that lead was, in fact, really toxic
and that there was no safe level of lead in our bodies.
We were also not aware that the movement of earth could perhaps cause exposures to that
lead for future generations.
When we draw the parallel to spent nuclear waste or radioactive waste, I feel it is exponentially
(19:48):
scarier to imagine how we might inadvertently create risk of exposures for future generations.
So in my mind, it's not sustainable.
It's not part of the sustainable solution for us.
However, fusion is so different from it.
And that creates the optimism.
Right.
Of course.
(20:09):
But also, from my perspective, I like it in terms of large scale commercialization.
If we're going to enable all of these technologies in the future, everything is plugged into
some sort of power somewhere.
Have you heard of the incident where they found barrels of chemicals off the coast of
California, like out where I live?
(20:31):
Yes.
Yes.
What was that?
I can't remember.
What was that?
It was from decades ago.
I think well before regulations even existed.
Today we've come a long way as we discussed, both in terms of addressing all this legacy
pollution, but also in terms of creating more health protective regulations and programs
(20:54):
for facilities that continue to operate in the state.
Yeah, I often think about that, with fusion being almost a clean slate a little bit.
In terms of energy, what can be done to avoid some of the pitfalls where, as you said,
(21:19):
other energy sources have caused this inequity?
How do we avoid some of that when it comes to fusion?
Is this something where governments need to prioritize certain communities over others?
Is that even pragmatic if you want to have a successful country?
(21:45):
What do you feel about that?
Two things.
One is, if you look at what has happened with electric vehicle adoption, try and
look at the statistics for where the electric vehicle chargers are installed, like where
is that infrastructure?
You will find that there is less infrastructure for electric vehicle charging
(22:06):
in what I call traditionally environmental justice communities.
Why is that?
Clearly that's because there's some sort of disincentive, or the incentive is not
quite there to drive adoption.
And how can governments create programs, and I think this is already happening at the federal
level, to make sure that that infrastructure is equitably available in environmental justice
(22:30):
communities?
So that's one.
The second thing which I think will happen by default as clean energy gets adopted is,
I'm sure in your lifetime you've been past a refinery.
There's refineries that exist today and there's communities that live by refineries.
And those end up being those environmental justice communities.
They see higher burdens of pollution.
(22:51):
But in a world where we have distributed clean energy options, there are fewer
communities living by refineries that are spewing out
air pollutants.
So I think that becomes a great equalizer where you take out one aspect of what creates
the environmental burdens in these communities, which is dirty energy.
(23:15):
And I think I want to call out that distribution aspect of things, because the distribution
itself with all of the trucks moving back and forth, that creates a whole other level
of pollution.
It's another lever that creates pollution.
So if we have distributed clean energy, there's less transportation, there's less emissions
coming out of the smokestacks or there's no smokestacks, plus there's less trucks moving
(23:37):
back and forth, moving the energy source back and forth.
Yeah, when I was at COP 28 in December, I talked about using fusion energy for industrial
heat in processing liquid fuels.
And I see I've driven by a lot of these communities.
(23:58):
And I think it's not so much the liquid, not so much the oil itself.
The processing of the oil is doing more damage to the environment than
the exhaust from your car after you put it in your car.
They're both harmful, but the processing part is so dirty on so many levels.
(24:21):
And we feel like that's something that can be cleaned up with fusion.
And also, petroleum's not going anywhere.
I'm sitting in my room and I know half the things here are made out of plastic and that's
pretty much petroleum.
Yes, absolutely.
Which is why we're agreeing that, I think, we don't solve all the problems with clean energy,
(24:45):
but we solve a chunk of it, a significant chunk of it, in terms of the immediate climate
challenges we're facing.
Plus we solve an aspect of the health burdens and pollution that communities are experiencing.
We solve these challenges.
And then there's more, which with targeted policy and programs, we can solve for, like
(25:07):
the use of plastic, for example.
Right.
Right.
And I think even when we're talking about the economically disadvantaged, we're looking
at it from an American perspective because we're here.
But economic disadvantage means something entirely different in poorer
countries, where there are countries that don't have electricity for 12 hours a day and things
(25:37):
like that.
And you think about what that does economically to what they can manufacture, to the education
of their children.
And I was born in India, and so I kind of understand that having a light bulb in your house makes a
huge difference, just that one thing.
(25:57):
Absolutely.
Absolutely.
The scale of the problem we're talking about, depending on where you are,
can be so much more amplified.
And the sense of urgency can be so much more amplified, like the flooding in Pakistan
we've heard about, a historic level of flooding.
But I think this is where what I talked about, that the responsibility is
(26:20):
shared but the burden is not, comes in: the burdens are not distributed equally.
And this is why the urgency that the moment calls for is maybe not felt equally across
the globe.
I know that in terms of regulatory stuff, it's a little bit positive here in that
(26:42):
fusion is going to be regulated differently than fission, and that America is making it
easier for fusion energy startups to grow and build and flourish.
But how do we translate that globally?
You've lived in a lot of places globally.
What do you think?
Change can happen, in my perspective, in two ways.
(27:05):
I do think that regulations and government agencies will have a very significant role to
play in creating a whole system.
And I think that innovation and technology, that in itself, is the other creator of change.
And I'm saying this because if you look at AI, for example, you called out how energy
(27:30):
intensive a search on a large language model can be relative to a traditional search.
That is true.
And as the use of AI accelerates and intensifies, these technology providers are going to be
asking about cleaner options for themselves.
(27:51):
And because they're going to have to report to their shareholders, in terms of some level
of ESG-like governance, on what their mix of energy use is, they're going to start to look
for cleaner energy solutions.
So I think they're both equally important.
In fact, ideally they would both be happening in parallel to create the impetus and the
(28:14):
energy that's required to propel this forward.
Yeah, I think about how difficult it is for a technology like fission to have global
commercialization because of the regulation for it.
(28:36):
And the weaponizable qualities of it and the environmental impact, all of those
things, of course.
So we almost have these global regulatory committees that stop the building
of fission plants in certain countries and whatnot.
And I feel like we can avoid all of that with fusion.
(28:59):
So globalization would be a little bit easier.
But yeah, I don't know which way the regulation will go globally with this.
Yes, I mean, typically regulation comes in when there is any potential for harm.
Aside from a license to do business in a country, there are aspects of energy, water,
(29:26):
wastewater, and social impacts that that business can create.
And this is why most governments run something called an EIR process, or EIR reporting,
which is an environmental impact report or risk assessment, to figure out whether it is OK to site this kind of
a business within their communities, and what sort of impacts that would create,
(29:48):
whether that is acceptable risk, and so on.
There are clearly established impacts when it comes to fission.
But fusion is completely different, and I would imagine that many governments
haven't even quite figured out how to even regulate fusion.
I think in order to answer that question, some of what is required
(30:10):
is education: educating the regulatory agencies, educating the communities.
Hey, not all nuclear energy is the same.
This is different.
And doing that often, more frequently, in the early stage, like now before it starts to
get commercialized, would actually help commercialize it and bring it to market more quickly.
Yeah, get ahead of it.
(30:31):
Yes, absolutely.
You can.
I know.
There was a breakthrough at Google, where you are, with the material sciences.
We talked about this a few weeks ago where, you know, they basically used an AI to
generate new material compositions.
(30:51):
Why haven't we used AI to end poverty?
Sushma, like why haven't we tuned an AI to basically give us all the answers to end
poverty?
Would we listen to that?
Maybe AI will have to achieve what they call AGI, Artificial General Intelligence.
And at that point, or well before that point, maybe government agencies will have systems
(31:16):
in place to disallow the development of AI to the degree that it can give us that level
of intelligence, because what if one of the solutions is like, let me just end humanity?
Without humanity, there's no need for housing.
There's no poverty.
Like, all of these become moot points.
I'm joking and getting ahead of myself.
(31:37):
I think the AI race is very nascent.
I think many questions have not quite been answered.
Actually, because I live in Silicon Valley, I hear the word AI a million times a day,
but there are parts of the world that have not heard it.
Right.
Like there are parts of the world that don't know what ChatGPT is or Gemini is and so on.
(32:00):
So I'd say we're very, very early in the race, and there's a lot more development that
is already happening and that's going to continue to happen.
I do think that as the development is happening, there is development of regulations happening.
Europe has traditionally been the outsourcer of regulations, as I call it;
a lot of the regulatory frameworks come from Europe, from France, Germany, and so on.
(32:24):
So we see them again taking the front seat on regulating AI.
And I see many other countries doing this as well.
So I think all of these things are going to shake out.
Governments are typically slow to adopt new technology, but it would
be great for governments to start using AI to drive the greater good.
(32:47):
Like I mentioned with the example of the EIR: they typically take five or 10 years.
Wouldn't it be great if we could use AI to accelerate the timelines, make development
decisions more quickly, understand health and social impacts, and make better choices?
So all of those things, I think, can happen now.
Yeah.
Yeah, that is the problem.
(33:08):
Humanity is the problem, not the AI, I guess.
That's weird.
Yeah.
Yeah.
So, oh no.
So the best way to have an energy efficient house is to not have a human in it.
That's what we're saying.
Yes, at this stage, all of us.
At this stage, yeah, it's too bad.
Maybe we'll make ourselves useful to the AI overlords and they'll decide not to destroy
(33:35):
us or we'll have singularity and we'll all become one.
We'll need a power system.
Neuralink, something like Neuralink.
We'll all come together.
There you go.
Yeah.
I see amazing things with bionic arms and legs.
Seeing bionic arms that make you want to chop your arm off.
(33:58):
We should probably cut that one out.
Yeah.
Yeah.
So let's talk about ethical considerations.
Even with fusion, there are clean ways to do fusion
(34:18):
and there are dirty ways to do fusion.
And depending on how humanity decides to commercialize fusion,
there are many ways we could do it where it almost offsets the good, clean energy it's
producing by having a really dirty supply chain.
(34:39):
Do you see what I'm saying?
So there has to be a way to keep it pure.
How can we ensure that?
So one example of what we just talked about is the electric vehicle.
And oftentimes this question has come up about whether electric vehicles are truly clean.
If you look at the life cycle of electric vehicles, all of the emissions it takes to produce
(35:03):
the vehicle might sometimes be greater than the lifetime emissions that one vehicle might
produce in use.
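A small illustrative sketch of the lifecycle point above, in Python. Every figure is an assumed round number; real values depend heavily on the battery, the grid mix, and mileage.

```python
# Compare an EV's (assumed) manufacturing emissions with its use-phase emissions
# over an assumed lifetime, under two assumed grid mixes.
MANUFACTURING_KG_CO2 = 10_000   # assumed emissions to build the EV, battery included
LIFETIME_KM = 200_000           # assumed total distance driven
KWH_PER_KM = 0.18               # assumed EV energy use per km

for grid, kg_co2_per_kwh in [("clean grid", 0.05), ("dirty grid", 0.60)]:
    use_phase = LIFETIME_KM * KWH_PER_KM * kg_co2_per_kwh
    print(f"{grid}: use-phase {use_phase:,.0f} kg CO2 "
          f"vs manufacturing {MANUFACTURING_KG_CO2:,.0f} kg")
# On a clean grid, manufacturing can dominate the lifecycle footprint; on a dirty
# grid, the use phase dominates, so the answer depends on the energy mix.
```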
So I think this is sort of a societal question, but a great question for governments to answer,
which is with any technology, any workflow, any process, there's a spectrum of ways to
do it.
(35:23):
And if I look at the parallel of how we handle those today, it's through regulation.
We basically have situations where any business in California, for example, would have to
get a permit to operate, especially if it is using hazardous materials and generating
hazardous waste.
So on a case-by-case basis, the regulators would go in and say, hey, for this
(35:46):
particular workflow, there are cleaner ways to do this.
I'm going to require that you use the cleanest ways to do this.
And they would include those requirements in a permit condition within the permits to
say this is the only way you can operate in California.
And of course, this creates some level of inequities from state to state because a facility
(36:06):
would have different set of requirements they would have to meet if they were to operate
in California versus Nevada versus Arizona and so on.
But at a principles level in California, we have said we uphold the highest levels of
environmental and health protection.
And that means that's going to translate to certain requirements for you.
And that is a cost of business, the cost of business in California.
(36:27):
So drawing that parallel to your world, in the world of fusion, I do see regulatory
agencies stepping up and creating different sets of requirements to ultimately move companies
along the spectrum toward the cleanest nuclear fusion options available.
And this might take time because this requires education.
(36:47):
So if you think about the kinds of permits and permit conditions that were written 10
years ago, they are very different from what is written into a permit condition or a permit today.
Because agencies have learned what the cleanest way of functioning or working is.
So that might happen in the future, where some of what you call dirtier forms
(37:08):
of nuclear fusion come to market, then the regulatory agencies learn, and then they sort of
move them along the spectrum over time.
Yeah, there are zoning issues now.
Like, I think if you have any system whatsoever that produces neutrons,
which fission systems and nearly all fusion systems to date do, it has to be in a certain location away from,
(37:33):
say, a population center or whatnot.
But I think that basically any experiment, any energy production, anything that produces
neutrons has to be outside of busy large cities, which is why, I think, when I
first started with fusion energy and we were thinking about where to build our facility,
(37:59):
somebody asked me, are you the type of fusion that can
even build in California?
And I didn't understand that question.
And I looked into it, and it turns out that if you have a neutron-producing fusion energy
system, so even if you are Lawrence Livermore National Lab or Berkeley or, you know,
(38:21):
UCLA, or General Atomics, sorry, which is in California as well, any of these fusion energy experiments that have neutrons have
to be far away, and we're yet to have an experiment that does not produce
neutrons.
(38:42):
So we would be like one of the first ones.
But I feel like California has a vague idea; but you're right,
I don't think the knowledge exists.
And what I fear is that I don't want to inadvertently make the same mistakes that the people who
dumped those barrels off the California coast made, where they didn't do enough research
(39:08):
to know the impact of their waste over the course of the next four decades.
And now we're sitting here scratching our heads as to how we could allow this to happen.
Yes.
It's troubling to me that that even happened.
And I would even say, to your point about zoning laws, they exist today
(39:30):
because we've learned from the past.
There were some meltdowns and health impacts and so on.
And from those we learned as a society and created zoning laws and so on.
I can also tell you, are they good enough today?
That's questionable.
Like there are situations where you'll find what looks like industrial polluters that
are next to communities because the zoning allows for it.
(39:51):
So what we thought was good enough 10, 15 years ago is no longer good enough,
which means we have to keep getting better in the way we regulate, in the technologies
we use.
And the regulations just create a way to push people along that spectrum toward the cleanest,
(40:11):
most environmentally friendly, most health protective option there is.
Does regulation get in the way of economic progress, Sushma?
And if so, is there a number where it's worth the price?
Like it's just kind of collateral damage where money is more important, but then there's
(40:32):
a point where no matter how much money it's not worth doing the damage or something like
that.
Where is that threshold?
And should a government have to enforce such a thing?
I think you're asking really important, hard-to-solve questions.
(40:56):
I, from my perspective, believe that regulations are important and have a role to play in a
capitalist society where businesses and companies are optimizing for shareholder value and their
bottom line, and where choosing options that are not as health protective, or that create pollution in
(41:16):
the environment, might generate more profits for them.
So regulations do have a role to play where we're collectively moving society towards
cleaner options and leveling the playing field to some degree so that that burden is equally
shared and everybody has to move along.
Having said that, it is a really difficult problem to solve.
(41:40):
And I will tell you from, I was at a board meeting yesterday, there are many communities
very upset at the pace of government that it is not protective enough.
And there's equally equal number of upset permit holders and businesses that feel that
government is too health protective and imposing too many requirements.
(42:04):
So I'd say, I joke about this, I often say everybody hates us equally, we're getting
it right.
And it translates from something and that joke aside, there are some principled ways
of doing this to make this equitable.
For example, we think about something called cancer risk and there's a number to it.
(42:26):
So we say, as a society, we've said, hey, if the cancer risk is one in a million, that
is acceptable.
If a new process, a new permit, something we're creating is going to create a risk
of one person getting cancer amongst a million people, that is acceptable.
But everything beyond that is not acceptable.
So then we require best available control technologies and so on to reduce that cancer.
(42:49):
So I'm just giving you one example, but there's so many other examples like this of numbers
we use to help make that decision and trend that balance carefully between economic benefit
versus environmental health protection.
And I actually like to believe that there is no trade off.
This is an artificial trade off.
Ultimately, it is this trade off doesn't actually exist if we can get it right.
(43:14):
Right.
So there's a gauge and there's a calculation of risk for all of these technologies.
But the fact that we even need to have it means that it's not a win.
Everything is a loss.
But one is a lesser loss than the other.
(43:36):
Now, when we're talking about newer technologies like AI, we don't
have that calculus built out for AI yet.
And that's why we spoke earlier about how there are government agencies looking at AI
from the perspective of where do we play, how do we play; we want innovation, but at
the same time, we don't want to face all of these downstream impacts from this technology,
(43:58):
like the potential risk of AGI and so on.
So there are these unanswered questions and there's new technologies that come up, but
where we've had legacy answers from processes that exist, we use regulations to move folks
along the spectrum.
What is the risk of AGI?
So is AGI just the fact that AI has consciousness, like it has decision making without
(44:23):
us actually having to give it a decision rule based on logic?
Is that what it is?
Is it just like, it's gone rogue and now it has a mind of its own and we fear that?
Are we to fear that?
So it's something called the alignment problem, which is, in theory, if the AI is working to
(44:46):
solve for us, so we are completely aligned, then it makes sense for the AI to produce
whatever it's producing, with the understanding that we have the same definition of a success
metric.
But if there's no alignment, then that creates a problem for us because then AI could make
(45:07):
decisions that are not aligned in terms of what we would like to achieve with that answer.
Oh, Sushma, but human beings make decisions against their own well-being.
All the time.
All the time.
All the time.
Like what I had today, that was probably just not the best thing for me.
(45:27):
Yeah.
Yes.
And this is where part of me worries that this AI world is going to create
8 billion versions of truth, because my alignment, or my focus, is to get
content that confirms my bias already.
(45:50):
So it's content that reinforces my confirmation bias.
So if we have that happening 8 billion times over, that's like every single person walking
around with their own version of truth.
And what does that mean for us as a society?
No, that scares me a little bit.
It scares me a little bit
(46:10):
to be like, wow, we cannot as a society agree on fact, because we have 8 billion versions
of fact.
See, I'm thinking that the only reason we have all of these issues is because we don't
have AGI yet.
I feel like humanity is what is messing up, is what is feeding the bias.
(46:38):
I think conceptually with AGI, that bias itself would be removed and it would just kind of
tell us what's best.
Yes.
So right now, like this whole thing.
Yeah, that's exactly right.
Like, so, you know, you may have heard about this, but how AI is hallucinating or,
(47:00):
like, telling people things.
No, I haven't.
Hallucinating?
What is that?
So with the LLMs, the generative AI solutions out in the market, if you test
them, and various journalists have tested them to various degrees, they start to hallucinate.
I think there was the one New York Times reporter whose chatbot told him to leave his wife, and there's
(47:20):
a whole bunch of biases that the current generative AI tools are starting to demonstrate.
And the whole concept behind that is, when someone puts in a prompt, the existing AI
solutions are transforming the prompt so that they add more color to the prompt that a user
(47:41):
is providing.
So ultimately, in an attempt to give better generative AI responses, they're
transforming the prompt that a user gives.
And in some cases, what happened recently was that the prompt transformations were done with the
intent of creating more inclusive solutions, or responses that might
(48:07):
be more inclusive.
The transformations were done behind the scenes, to the point that when someone asked AI for
a picture of the founding fathers, the image that the AI generated was a bunch of brown
people.
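A hypothetical sketch, in Python, of the behind-the-scenes prompt transformation described above. The rewrite rule and the stand-in model call are invented for illustration; no real product or API is implied.

```python
def transform_prompt(user_prompt: str) -> str:
    """Silently append a hidden instruction the user never sees (assumed rule)."""
    hidden_suffix = " Depict a diverse range of people."
    return user_prompt + hidden_suffix

def generate_image(prompt: str) -> str:
    """Stand-in for an image model; it just echoes the final prompt it received."""
    return f"<image generated from: {prompt!r}>"

user_prompt = "A picture of the founding fathers"
print(generate_image(transform_prompt(user_prompt)))
# The user asked one thing; the model was quietly asked something broader,
# which is how the mismatch described above can arise.
```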
Right.
So it's things like that.
But all of that just gets to your point.
There's such a high degree of human intervention in the AI development process that it's bias
(48:33):
in bias out.
Right.
And that bias went the other way a few years ago in that it was only able to recognize
white male faces.
Yes.
Correct.
Especially with, my goodness, especially with facial recognition with companies
and things like that, there were issues.
(48:54):
And now it's kind of like nobody is getting away from it.
I think everybody is affected by it one way or the other, no matter who you are.
Interesting.
Yes, a lot of problems for us to solve.
I'd say, bringing this back full circle: if we actually want to solve the AI problem,
which is not a problem, but essentially to develop this technology and put it in the hands
(49:17):
of eight-billion-ish people on earth, that's going to require a lot of energy.
It's going to create a lot of productivity and it's going to really improve people's
quality of life.
So I fundamentally believe in that.
But then it's very energy intensive.
So how could we solve that now, versus creating even more health impacts from dirty energy
and flooding our health care systems that are already pretty burdened?
(49:40):
Yeah, it's all connected.
But it all comes down to energy.
I mean, sometimes I think maybe that's me because that's my world.
But it's the truth.
Everything comes down to energy and how we use it.
Yeah.
How do we get young people, who are unhappy and not hopeful, engaged?
(50:01):
That's bothering me.
I was reading about that earlier this week, and it really bothered me.
They're all unhappy because they don't see hope in the future.
So, you're on the cutting edge of a lot of things here in terms of AI, machine
learning, fusion energy.
How do we get these guys to be a little happier and participate and be
(50:22):
hopeful?
So I don't actually see the lack of optimism in youth as much.
And this might be the getting to the apocalyptic optimism part of me, which is I'm more of
an optimist on this side, because I'll just tell you from the example yesterday.
At yesterday's board meeting, I had maybe over a dozen UC students show up and talk
(50:44):
about how the issues we were discussing were important to them.
And that just made my heart sing.
It's like, this is what we need.
We need youth to be more engaged, to show up and have their voices be heard.
And that matters.
So I do think that there's an awareness gap.
(51:07):
I just if we can cross that awareness gap and get drive more awareness with you, they
will be more engaged is my my optimism is is that it's an information problem.
And of course, information problem is a very hard one to solve in the world of tic tocs
and Instagrams and so on that that drive their attention away from real issues like energy
(51:29):
and climate and so on.
So how might we may perhaps even using these platforms drive awareness and get them engaged.
Any thoughts on the TikTok ban?
Do you think that will actually happen?
So, from my personal point of view, there are a lot of hurdles
before it can pass: it passed the House vote, but then there's the Senate, and then the president
(51:53):
has to sign it, both of those.
There are two different steps in the process that this bill on the table can fail at.
But there's also the issue that, if the bill were to be passed and signed into law,
then how do we decide who the new owner of TikTok in the US can be?
Like, what's to say that whoever in the investor group decides to buy
(52:19):
TikTok is not biased in some way, or doesn't have interests tied to the Chinese government
in China and so on, and can meet the principles of what this ban is about?
I think that hasn't quite been solved.
I think on paper it says the president must approve the new entity, whoever the new
(52:40):
entity is that ends up owning TikTok in the US.
But is that good enough?
Like, do we think that the president, one person alone, should have this power to decide?
There's a lot of theories like depending on which presidential candidate wins, do we think
that they will make an unbiased decision?
(53:02):
And also, there's another question now about polling, right?
How is Biden versus Trump?
How are they doing in the polls with respect to youth?
And youth are the ones that are using TikTok.
So will the presidential candidates feel like they're going to be penalized by voting for
or against the ban?
(53:22):
And how does that factor into the decision?
Yeah, I have to say I'm glad I don't have to make this decision, I'll tell you.
Yeah, me too.
Me too.
I don't envy people that have to.
But fundamentally, for me, it's like a freedom thing.
It's just like a freedom of speech, a freedom of knowledge thing.
Like I should be able to learn whatever I want to.
(53:45):
And if I'm watching a propaganda video, I should be allowed to watch the propaganda
video and decide for myself whether or not it is propaganda.
I guess, but I don't know.
It's more subliminal than all that.
And I understand.
But I feel like I don't think anything should be censored whatsoever.
(54:05):
So I think it's just different.
But I feel differently about my daughter; I want a lot of things censored for her.
There is.
There you go.
You know, you don't.
I mean, that's true for society.
Like I don't want something being imposed on me, but I want to be able to impose some
sort of control mechanism on someone else.
(54:29):
Well, I would never.
I don't care about what an adult human being does with their life, but I don't want my
11 year old, you know, just, I don't know.
I don't want her.
I don't think that she has the ability to decipher what is propaganda and what is not.
I only think that I know, but maybe I don't either.
(54:49):
So yeah, I don't know.
Yeah, glad I don't have to decide.
You're right.
But I like my TikTok.
I will miss it.
I like it for what it is.
But I don't think it's going away.
Whatever the bigger picture looks like, I don't know.
I think the face of it might change.
(55:10):
Yes.
Yeah.
It's just going to make it boring.
Something will change, but it's hard for me to imagine that it just goes away in
the US.
Yeah.
I'm just guessing they'll change the algorithm to where all the fun conspiracies are gone.
That's too bad.
I kind of like watching the lizard people videos.
(55:32):
Yeah.
Well, hurry up then, Priyanka, before that happens, and make really solid nuclear fusion
viral content for youth to consume, so that they can show up at government agencies and pound
(55:53):
the table to get more incentive money to flow into fusion so it can get commercialized
sooner.
Oh man.
I don't know.
It's hard to make energy interesting.
It's just about the most boring topic ever.
People don't understand.
It's so fundamental to our reality that it's like, I don't know.
You could tell young people that we can build a lot of things with it, clean the planet,
(56:13):
solve a lot of things.
But fundamentally, when you start talking about energy and oil and steel and cement,
these are all not exciting topics, but it's what we're all built on.
So I don't know.
I am not an exciting person.
I would not know how to make things exciting for young people, but we should make TikTok
(56:37):
videos.
You should make them Sushma.
Actually, on that topic, there are some nonprofits that work with youth to get them engaged on
solving social and environmental challenges.
One I was associated with, called Fishbowl Challenge, does this.
Youth globally get together in different cohorts and solve for a problem that they're passionate
(57:01):
about.
So I just wish that that could be on steroids and would drive more engagement to solve for
what you and I just spoke about, which is getting more youth engaged because we ultimately
need them to get us out of this fossil fuel system we're in, that sort of new energy to
(57:21):
create that change.
Yeah.
And younger people now are more knowledgeable than even I was when I was in my teens.
My parents definitely had no idea what was going on with all this when they were 16,
17.
Today, 16, 17 year olds, they understand more than we did at that age.
(57:44):
That does give you a lot of hope.
And you're right.
We have to appeal to the things that they care about.
Yeah.
And energy is invisible.
You're right.
Energy is invisible.
So how do you make it visible so that they can then engage with it?
Right.
Yeah.
That's the perfect way to say it.
(58:05):
Yeah.
And it's not exciting.
It's not exciting.
Things like an energy power plant take like a decade to build.
That's not like an exciting new app or fashionable new ripped jeans or something.
Like it's just not.
This is where I think there's an opportunity in those like environmental justice areas
(58:28):
to do something that's related to clean energy, maybe bringing clean energy there first because
in those communities, what I've seen is the youth are very engaged because they experience
the burdens firsthand.
That propels them into action.
So maybe there's a world where the clean energy solutions for a change go to those communities
(58:49):
first.
Did you, where did you grow up Sushma?
Where were you born?
I was born in India.
I grew up between the Middle East, Singapore and then India.
Did your upbringing lead you to be involved in environmental justice then?
Like when you were young, did you see environmental injustices where you grew up?
(59:15):
I saw them, but honestly, I didn't have the vocabulary then to understand that they were
environmental injustices or energy inequities.
I think that vocabulary came now, like in the last decade.
So yes, I had asthma as a child growing up, when my parents lived right next to
a freeway, but now I would never do that for my kids because I know about the health impacts
(59:39):
that it can cause, but that's where they lived.
And I also experienced conflict, which I fundamentally believe was an energy-based conflict,
during the Gulf War when Iraq invaded Kuwait.
One of the first things Iraq did was to set the oil wells on fire so that there wouldn't
be access to energy within Kuwait.
(01:00:02):
It was a way to control the government of Kuwait.
And so I experienced that as well, which then brings me to where I am today and why I'm
passionate about the issues that I work on.
Yeah.
I also think about not just the cleanliness of fusion energy, but the things that
(01:00:26):
have affected peace, shall we say, historically because of energy, which has been because there
are deposits of oil in specific places on this planet.
And so that's almost the environment doing people an injustice, I
guess.
I don't know how that quite works, but it's not something that we had a role in as people;
(01:00:50):
you just hit the lottery or you don't.
And that's caused people to fight over land and all of this.
And then you think about fusion and all of the raw materials that go into fusion can
be made in a lab just about anywhere in the world.
And so there is like an equalizer there.
(01:01:11):
But then there's also a possibility where one country can just make more of the hydrogen
isotopes or it has more access to go to the moon and come back.
And so now I think part of the space race that we see between a few countries now also
has to do with getting helium-3 from the moon, getting other resources from the moon
(01:01:32):
and being space faring.
And that's like the next frontier for resources.
But I feel like, at least in terms of world peace, we don't have these specific deposits
here and there that dictate who has power over whom.
(01:01:55):
Yeah.
I always think about water, water and desalination of water.
In the Middle East when you were growing up, was your water, all of it desalinated?
Were you in one of those countries?
No, not when I was growing up.
The water was shipped from across the globe because a lot of the Middle East doesn't really
(01:02:23):
have it.
But desalination, I think, again from a moment-in-time perspective: that option,
15 years ago when I was at the municipality, was one of the many considerations, and
the focus was on water use reduction.
I think where we are today is that water use reduction is a nice to do, but we really need to focus
(01:02:49):
on alternative solutions very quickly.
And because of that, desalination is not a nice to have, but a must do.
Yeah.
I've seen that near Dubai, there are these fission-powered desalination plants that are like
(01:03:11):
hundreds and hundreds of acres of tanks.
And it's just like this ecosystem of one form of energy feeding the other and then producing
water as well as electricity and all of these things.
But they're complex infrastructures and they're very, very expensive to build.
(01:03:31):
It's interesting because I feel like water is almost more important than fuel.
But most of the conversations in the sort of popular ethos are about fuel
and not that much about water.
I worry more about water.
(01:03:53):
So that'll be interesting.
To add to your comment about water, what I worry about is, in a world where we start to
put an emphasis on distribution of water, and there are fewer local sources of water, and water
gets distributed in bottles and so on,
there's then the issue of microplastics in water and how that creates health impacts.
(01:04:15):
Yeah.
It's disgusting.
I was reading something about yoga pants being made out of plastic.
Recycled plastic.
Yeah.
I think it breaks your skin barrier and gets in your bloodstream.
Yeah.
Yes.
I think I've known of this.
(01:04:37):
Sometimes you hear about the early signs before something
becomes mainstream.
That one, and also the issue of PFAS and Teflon and those nonstick products showing up in
the bloodstream,
I've known about for a long time.
Yeah.
No, I meant like, yeah, I think we live in a place where we have the luxury of knowing
(01:05:01):
these things early in the game.
I was watching a TikTok video, by the way, I'll tell you, where this woman, literally,
in order to make her food crispier, as her oil was frying and she had
her potatoes or chicken or whatever in the oil, she used to throw in a plastic bag
(01:05:22):
so that her food would come out crispier.
And she fed thousands of people this way.
And it was mind blowing to me.
This was happening like last year and this guy was talking about how his grandmother
does this.
And he was saying it in the most nonchalant way where he was like, hey, I have a hack
for you.
Do you know how you make your food crispy?
(01:05:44):
Throw in a plastic bag.
Are you kidding me?
Oh gosh.
Yeah.
So I think we have the luxury of knowledge.
Yes.
I mean, the awareness piece.
And now, if you go back to our AI conversation, think about a world where every single person
is getting their own versions of truth and their own hacks without like there being a
(01:06:05):
collective base of knowledge that we consider facts, like plastic, that's bad for you.
In a world where this gentleman is not even getting served up content that tells him plastic
is bad for him, he's going to continue to do it.
No, it was a woman.
It was a grandma.
Or a woman.
Or a grandma.
That's right.
(01:06:26):
Yeah.
So she's going to continue to do it.
And I've definitely been in situations where when I did my community work, I met a family
who came to one of the community meetings and talked about how their child had eczema
and one way she was attempting to solve for it was putting the kid in a tub of water with
bleach.
(01:06:47):
Oh my gosh.
Exactly.
Exactly.
So we truly have the knowledge gap.
Yeah.
I just wonder, for society, if AI can be used for good, but will that exacerbate
this knowledge gap and create worse outcomes for some, for the folks that are already
(01:07:08):
disproportionately burdened.
Yeah, no, I know.
I feel like this just makes me even more on the side of we need AGI to
rescue us from humanity.
We should.
I'm just like, oh man, people.
Yeah, cool.
This might be a good starting, a stopping point for us.
(01:07:31):
What do you think?
On a light note or not, depending on how you look at it.
I don't know.
Yeah.
Now that we've had a good laugh, that's a good stopping point.
That's true.
What do you think?
Anything you want to leave us with here?
Anything you want to say about, not what you think it will look
(01:07:57):
like,
not what you think fusion will look like in 20 years,
but more what you want it to look like?
I want to know that more from you.
What do you want fusion to look like?
What I want, if I had a magic wand: every county. I think about how people have greater
(01:08:18):
connection within their local communities, at their county level.
But for everything that happens beyond the county, even at the state level or federal level or
global level, there's so much cognitive overload that they cannot connect to it.
So I see a world where within their little city or county unit, there's a clean energy
(01:08:40):
source that they're able to tap into.
And they have a full awareness of not just their energy portfolio, but their environmental
burdens, health burdens.
They're collectively able to make decisions within that county unit.
Wouldn't that be great?
We talked about zoning.
What gets located within that?
What kind of development happens within that city or county unit?
(01:09:03):
That would be great.
That would be the world that I'd love to live in.
I know how to influence my little county to introduce more parkland or make a choice about
affordable housing versus the library and so on.
That, I think, is ultimate.
I think we're far from it because the energy choices are not really choices that can be
(01:09:25):
made at the local level.
Those choices are made for us.
And that's what I'd like to see change.
That's awesome.
Why are those choices made for us, do you think?
Is it because we have a capital incentive when we should have more of a social incentive?
(01:09:46):
Is it just that?
I think today, if you as a consumer think about your own energy bill, do you have a
choice to go to a different utility provider and say, no, I don't like your energy mix.
I like somebody else's energy mix.
No, you don't.
So that's what I mean. We don't really have a choice in terms of who our provider is
and what their energy mix is.
(01:10:07):
There is a negotiation that's hopefully happening between the utility provider and a state
agency perhaps, a federal agency.
But there's less choice for an end user to say, hey, I want to do this.
I want to invest more money and change my energy mix.
You don't have that choice.
And that's the choice I think I would like to have.
(01:10:30):
Yes, me too.
I wholeheartedly vehemently agree with that so much.
I want that choice as well.
And I didn't even know how much I wanted it until you articulated it.
And now I want it more than ever.
Perfect.
When you start to make choices on your own, you can.
The choice I've been able to make is to have a solar roof.
(01:10:52):
I have a solar roof.
I have an electric vehicle and I have a charger in my home.
But if you give me agency, I will make more of those choices.
But when I don't have agency, I cannot make them.
Yeah.
A big example is the small number of cell phone service providers in
America versus Asia.
And the competition, or lack of it, and the price fixing and all of that, it's annoying.
(01:11:18):
The price fixing is annoying.
It doesn't bode well for a free market, and that bothers me.
Yeah, I agree with you.
Yeah.
Thank you so much for this, Sushma.
Thank you.
On that note, on that vision, we're going to make that happen, right?
Yeah.
Cool.
Any quick plans for the weekend?