Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:07):
In five years, we'll legally own our own thoughts. That's
the premise of today's conversation. I'm Azeem Azhar. Welcome to the
Exponentially podcast. In the past five years, over a billion
(00:28):
dollars has been invested in developing new devices that can
read or even manipulate our mental states. They can help
us relax, learn, or reduce pain, and as they do,
they harvest data. So how can we make use of
this technology without firms taking advantage of us? Do we
need a new fundamental human right, a right to our
mental privacy? To discuss the ethics of this emerging neurotechnology,
(00:52):
there's no one more qualified than Professor Nita Farahany. Nita,
you're a professor of law and philosophy, and you got
there by way of genetics and cell biology. Did you
(01:15):
have much fun at college?
Speaker 2 (01:17):
I did.
Speaker 3 (01:17):
I probably had too much fun at college, But you know,
I came into college already really enamored with genetics and
behavioral sciences, and while I was there, ended up doing
a lot in both genetics but also neuroscience and government,
an odd pairing of my interests.
Speaker 1 (01:37):
So how did you show up to college with a
deep interest in those quite different subjects?
Speaker 3 (01:44):
As a high school student, I had taken a genetics class
and found that not only was I really interested in biology,
but I really enjoyed the mathematics, the modeling, but especially
the behavioral ties.
Speaker 2 (01:58):
I found that fascinating.
Speaker 3 (02:00):
But I also was a policy debater in high school,
and I was recruited to college for policy debate. And
I had spent summers as a geeky nerd at debate camps.
Speaker 2 (02:12):
I mean, they were fun.
Speaker 3 (02:12):
I enjoyed it, but it's a geeky endeavor. And the
result was I was very passionate about policy. I was
thinking about the implications of policies for society and for humanity,
thinking about the broader implications and applications of technology and science.
I think I sort of didn't realize it at the time,
but was destined to follow the path that I ultimately did.
Speaker 1 (02:34):
You've turned your attention to one of science's remaining frontiers. It's,
in a sense, the last bastion of our freedom. It's
the nature of our brain, our cognition, and our mental state.
And I wanted to share a story with you. A
few years ago, I was sent a headset from a
startup that was meant to stimulate your brain to improve
(02:56):
your cognitive performance. You put it on your head
and it would zap you with some direct current, and,
you know, I was too scared to use it, so
I put it...
Speaker 2 (03:06):
That's probably a good thing.
Speaker 1 (03:08):
I put it back in its box and I now
use it as a shelf for my printer.
Speaker 3 (03:13):
It's probably a good thing that you didn't use that
one back in the day, because we don't really have
a very good understanding of the effects of zapping your brain. So
as you think, as you do anything, neurons are firing
in your brain. They give off tiny electrical discharges. And
when you're doing anything from feeling to smiling, to thinking
(03:34):
to doing a math problem in your head, hundreds of
thousands of neurons are firing in your brain giving off
concurrent electrical discharges.
Speaker 1 (03:42):
And so we can measure those electrical signals. I guess
that approach is the EEG. And if anyone is old
enough to remember Ghostbusters, Rick Moranis's character gets wired
up to one early on. And yes, we can
measure those electrical impulses and also their waveform,
and whereabouts in the brain they're happening.
Speaker 2 (04:05):
That's right.
Speaker 3 (04:05):
Now, if you're using consumer grade electroencephalography, or EEG,
you're going to have far fewer electrodes, and so it's
really the average across the entire brain rather than picking
up from a specific brain region. You might have it
in the form of earbuds or headphones or headbands, and
they're picking up the averages of those waveforms, and then
those averages, through software and advances in artificial intelligence, can
(04:29):
then say Okay, this is happy, this is sad, this
is paying attention, this is fatigue, this is mind wandering.
So they're kind of big brain states. They're not very
precise kind of mind reading, right.
Speaker 1 (04:41):
So those are all mental states that you could imagine
matter to students who are studying, to people who are
working in desk-based jobs, or actually in non-desk-based
jobs in the field.
Speaker 2 (04:56):
Right.
Speaker 3 (04:57):
So already this particular company has partnered with a number
of corporations as an enterprise solution, where it's offered
as part of a brain wellness program, for example, to
allow employees to train their focus.
People are really distracted in today's world, right, and they
have so much context switching. You're on your computer, you
(05:18):
then quickly go buy something for your kid's birthday party,
then you get a notification on social media, and
that context switching is costly for your ability to focus.
Speaker 1 (05:29):
And it's very hard to judge in the moment or
even after the fact, when you were concentrating well and
actually whether you were genuinely in a moment of flow
or a moment where you've blocked out the rest of
the world. So what you have here is a more
objective measure of where you actually are, in the
(05:50):
same way that smart watches can really give you a
sense of whether you actually did walk the ten thousand
steps or not.
Speaker 3 (05:56):
I was talking with a company that has used this
technology on employees where they were trying to make managerial
level decisions about work from home policies versus work in
the office, and they were looking at some of the
brain data to see the extent to which people were
more distracted or better able to focus in a home
based environment versus in the office environment. Or you know,
(06:17):
you think that you're really a morning person because that's
when you're in the best state of flow, and it turns
out your best working hours are like three to five
in the afternoon.
Speaker 1 (06:24):
What happens to people when they get told that, after
years of thinking that they were a morning person, they
turn out to be a light out? I mean, how
do they feel?
Speaker 2 (06:34):
You know?
Speaker 3 (06:35):
I think for some people it's helpful, right, Oh, that's
why I'm stuck. That insight is so incredibly powerful for me.
For other people, it causes self doubt, a bit of
an existential crisis. To what extent can you trust your
gut instinct, your so called inner voice when the data
doesn't stack up or measure up with what your own
(06:55):
sense of self is. And then for others there's just doubt. No, no,
the technology must be wrong because my own perception of
self is much more accurate than any technology could ever measure.
Speaker 1 (07:04):
So these are all consumer grade devices that are in
a way leaking into the workplace. But of course there
are a whole range of different neurotechnologies, some of which
are much more complex and sophisticated used in healthcare, some
of which are just not as portable as these devices.
Speaker 3 (07:20):
These right now are primarily for entertainment or very low-stakes
decision making. They're not diagnosing a concussion on the football
field, or maintaining real-time information about what's happening in
the brain of a person with epileptic seizures, or serving diagnostic
purposes for Alzheimer's disease or Parkinson's or other kinds of
conditions. That's the province of the more sophisticated neurotechnology.
(07:42):
And by sophisticated, I just mean it has better signal
to noise: it's able to more precisely pinpoint in the
brain where something is happening. You have something
that gives you clinical-grade information about brain activity or
a brain disorder. And those really run the gamut
from electrical stimulation to implanted neurotechnology, where there have been tremendous
(08:03):
advances in the past few years. Companies are investing
in the ability for somebody who is paraplegic to be
able to navigate their environment, or a person who has
lost the ability to communicate to be able to speak
their mind by using implanted neurotechnology.
Speaker 1 (08:21):
So these are the so-called invasive neurotechnologies. In other words,
they're physically in the brain. They probably require you to
go into surgery to fit them.
Speaker 3 (08:31):
They obviously carry greater risks because they entail surgery.
But what you can get from going deep into the
brain right now is obviously much more powerful than having
to pick up a signal that goes through the scalp
and through the skull, where it's degraded and far less
of it can be picked up.
Speaker 1 (08:47):
And what's the role of what I imagine as the gold
standard of all of this, the MRI machine. You lie down,
you go into this tunnel with these very powerful magnets.
There's a lot of clanking, but then you get these
very small slices of your brain and the blood flow
activity within them. And these machines are big: they
(09:07):
weigh thousands of pounds, they cost millions of dollars. Is
that the gold standard to which we try to reference everything.
Speaker 3 (09:14):
Functional magnetic resonance imaging machines. They can peer deeply into
the brain. They don't have very good what we call
temporal resolution, which is to say they're not fast; they're
not measuring rapid changes across the brain.
Speaker 1 (09:25):
Like high-resolution photographs, yes, rather than a film.
Speaker 3 (09:30):
That's right. But the ability to peer deeply
into the brain and to be able to say,
literally, this is what's happening, or this is what's
going wrong, that is the gold standard for at least
diagnosing some things.
Speaker 1 (09:43):
It seems that in recent years we've started to close
the loop from lots of different ways to read brain
states through to how we can then change those brain states.
Talk about some of these implants. It seems like some
of the technologies that you've been discussing are as much
(10:05):
about reading the brain state as they are about imposing
a new brain state on us.
Speaker 2 (10:09):
That's absolutely right.
Speaker 3 (10:10):
So, for example, there was a woman who was suffering
from severe depression and she described herself as being at
the end of her life, no longer in a life
worth living, and neuroscientists and physicians were able to track
the specific patterns of brain activity and then use electrical
stimulation in the brain to reset that brain activity like
(10:31):
a pacemaker for the brain. That enabled her to recover
her will to live, to overcome depression, and have a
typical range of emotions instead. Wow, that's pretty extraordinary.
Speaker 1 (10:43):
That's extraordinary because right now the disciplines of neuroscience and
psychology are rather different, yes, and they're separate. And
what you've given an example of is using neuroscience to
try to tackle something that would have traditionally just been
seen as a psychological problem. That's right. So let me
(11:09):
summarize where we are. We've got this fantastic science and
engineering which is progressing, and it's giving us a new
class of neurotechnologies which we can use for diagnostics, for therapeutics,
for augmentation. But these are also dual use technologies. They
could be applied in ways that are harmful. Perhaps. I
(11:30):
remember reading about a school in China, for example, which
for a short period of time was putting a
brain monitor on the kids and allowing the teachers to
figure out which ones had drifted off. Now, I think
it's a right for a young child to zone out
at school. I mean, we all did. What are the
(11:50):
other examples that you can think of that are showing
the potential drawbacks of this technology?
Speaker 3 (11:55):
I think anytime you're looking at, for example, coerced use
of it, whether that's in the workplace or in a school setting,
it's deeply problematic. And that's because not only do
you have a right to have your mind wander, but
you have a right to think freely. Right the ability
to be able to have a dissident thought or have
a creative thought or even fantasize about the coworker in
(12:19):
your office you have a bit of a crush on.
Speaker 1 (12:22):
I work on my own.
Speaker 2 (12:23):
Well, there you go, no crushes for you.
Speaker 3 (12:25):
But you know these flashes of bad thoughts you have
for a moment: somebody cuts you off in traffic and
a bad thought pops into your head. Right,
being able to think freely without fear of having your
feelings or thoughts intercepted or manipulated or punished, I think
is critical to human flourishing. In other contexts, it's already
being used in these kind of coercive ways, like in
(12:47):
police interrogations, where people's brains are being interrogated to see
what they recognize. Yes, it's already happening. There's a company
in the US that has been selling their technology to
law enforcement agencies worldwide. Whether it's that technology which has
been used in places like the UAE or in Singapore,
is being tested out in Australia, and there have been
(13:08):
criminal convictions that have occurred as a result of so-called
criminal confessions in response to brain-based interrogations. It's already
happening worldwide. Those are I think very chilling applications of
this technology.
Speaker 1 (13:21):
I've read a slightly scary example of a beer company
trying to change the way we dream so that we
would dream about their beer. I mean, how does that
even work?
Speaker 3 (13:34):
So neuromarketing has been around for a while. There are
certain periods of time, like when you just wake up
from a dream, that you're more suggestible. And by more suggestible,
I mean your mind can be changed more easily. Yes,
And so in that suggestible state, what some dream researchers
were hired to figure out was whether your
(13:55):
dreams could be used as a time to incubate particular ideas.
They call this dream incubation. So this particular beer company
they had, as their kind of logo, associations of mountains
and streams, and they want you to think of their
beer and think of being refreshed.
Speaker 1 (14:10):
I'm almost speechless. It really sounds horrible.
Speaker 3 (14:14):
Yeah, no, it does sound horrible, and I've gone to
the most dystopian place possible with it. So now imagine
you're wearing your overnight brain sensors, because there are earbuds
that are designed to detect your sleep and to help
you monitor your own sleep. You have your, you know,
smart speaker that's listening in your bedroom. They're connected together,
and at the moment at which you're most suggestible, a soundscape starts
to play without you even realizing it. And this is,
(14:36):
you know, the newest place that people pitch their advertisements.
Speaker 1 (14:40):
That logic is completely inverted. The moment when we're most
suggestible is the moment when there should be no marketing aimed at us.
Speaker 3 (14:50):
One hundred percent agree, right? This is just the dystopian vision
of this: if advertisers are trying to find the
moment that you are most likely to develop positive associations
with their brands or their products, and there are no
regulations that would prevent them from being able to commodify
your brain data and to target you with advertisements, then
why wouldn't they right? And the only thing that would
(15:11):
stop them from doing so is strong ethical norms
developing against it, and legal rights that would
protect people from having that happen to them.
Speaker 2 (15:19):
If you could take.
Speaker 3 (15:20):
A person who's suffering from trauma, who voluntarily said, like, yes,
I would like for treatments while I'm sleeping to incubate
positive memories or associations instead, it's not a bad thing.
And done in a controlled research environment with consent, that seems
potentially like an incredibly positive application of it. When instead
what you have is what seem like gimmicks. But I
(15:43):
worry that each instance of the use of neurotechnology in
these unexpected applications and unexpected places subtly and very perniciously
normalizes people to a future of neural surveillance, where even
our brains are hacked and tracked.
Speaker 1 (15:59):
These technologies are improving exponentially, and there are these combinations
as well. So I've been tracking a variety of experiments
where scientists have connected the readings from fMRI machines to
different types of AI, and they've been able to reconstruct
the image that you've been thinking about. You've been thinking
(16:20):
about a church, and they can produce a sort of
blurry church. And over the last four or five years
that church has got sharper and sharper and more precise,
and even more recently, scientists have been able to look
at fMRI readings and predict the words that somebody was
thinking about. Now, of course, an fMRI machine is this
(16:41):
huge machine, not that practical, but the point being that
once we are able to correlate activity as we see
it through an fMRI machine to particular sequences of words
or pictures, we can then find easier-to-read signals
and then without needing to pop you into an fMRI machine,
start to extract those thoughts or those pictures.
Speaker 2 (17:03):
Yeah.
Speaker 3 (17:03):
So there was a recent study that was really
a leap, an advance, in this area, using the advances
in generative AI, and researchers were able, with a
very high degree of accuracy, to reconstruct continuous language from
the brain. They were then curious as to whether those
same findings could apply to a different portable system called
(17:25):
fNIRS, functional near-infrared spectroscopy, which picks up blood
flow changes in the brain just like an fMRI does. Could it
apply in that context as well?
Speaker 1 (17:34):
So blood flow being the key thing that appears to
connect to what we're thinking.
Speaker 3 (17:39):
They trained it on fNIRS to see if it
worked in that context, and it did.
Speaker 1 (17:43):
It sounds like magic. It is a way of peering into
the brain using light, sort of infrared light,
the kind that might come out of your TV remote control.
Speaker 3 (17:54):
There are others who are working on something like a
bike helmet that would have fNIRS, and enable us, portably,
to be able to pick up even potentially our very
thoughts from our brain, that could be decoded using these
advances in technology.
Speaker 1 (18:07):
And of course, hundreds of millions of us have these
smart watches that actually have quite cheap sensors in them
and yet can be really accurate for measuring things like blood
oxygenation or pulse and all sorts of other biomarkers. So
we've got some expertise in taking lower cost, low fidelity
(18:29):
signals and turning them into more reliable signals.
Speaker 3 (18:31):
That's right, and to be able to filter out
things like noise. Even the computation, which used to be
very difficult to do, you'd need a huge room to
do that kind of crunching of the data, is now
done on device. Literally, with a little device like this,
together with something like a mobile device, you can do
incredibly sophisticated imaging and
(18:51):
decoding at the same time.
Speaker 1 (18:53):
How long before there are going to be affordable portable
fNIRS devices that you might see employers buying for their
workforce?
Speaker 3 (19:05):
That's both affordable for them to buy and makes sense in
some ways to distribute to employees. And that's for a
couple of applications. Those are to do things like for
brain wellness programs to decrease stress levels in employees, or
this one I use for focus. You can retrain your
brain to have longer periods of focus and increase the
(19:26):
activity in your prefrontal cortex, which is a form of
cognitive enhancement.
Speaker 1 (19:30):
The idea of you, Nita, with your multiple degrees and
two professorships, having even more focus.
Speaker 2 (19:38):
It's an unfair advantage.
Speaker 1 (19:40):
It is. It is. We have these very powerful technologies
that are improving exponentially, and we can see many of
the upsides. We can also see how they can be
used against us. The thing that strikes me about these
(20:02):
technologies in particular is that the brain is that last fortress,
and our inner self is the thing that defines us.
And yet we're at this turning point where that inner
space could become part of some business's private enclave,
or some government's Orwellian fantasy. What do we
(20:25):
do about it?
Speaker 3 (20:26):
So I worry about this deeply. I worry that
that final private space, the space that I think is
so critical both for self-awareness and for resilience, for the
ability to know truth from fiction, for emotional self-reflection
and cultivation, is at risk from brain transparency.
Speaker 1 (20:46):
You mean the idea that we can just be seen
through entirely.
Speaker 2 (20:51):
Essentially, right.
Speaker 3 (20:52):
I think most people consider having that space of mental
reprieve being able to think about what you're going to
say next, with whom you share information, with whom you're vulnerable.
Speaker 2 (21:03):
Right.
Speaker 3 (21:03):
The way we define intimacy in a lot of relationships
is by choosing what emotions, what thoughts, what secrets we
want to share with another person, and if all of
that can be revealed by decoding your brain activity, if
other people can have access to it, whether it's your
partner who says, no, no, I want you to prove
that you're in love with me, or you know, your
employer who says, you know, I need you to have
(21:26):
five hours of focus today.
Speaker 1 (21:28):
Or show me you gave me one hundred percent.
Speaker 3 (21:31):
That's right, prove it to me. Or, you know, a lie
detection test by the government, or the way you authenticate
yourself at borders is through brain transparency. I think this
final space of mental reprieve.
Speaker 2 (21:43):
Is at risk.
Speaker 1 (21:44):
Your suggestion is cognitive rights. What do you mean by that?
Speaker 3 (21:49):
So I believe that a right to cognitive liberty as
an international human right, a right to self-determination over our
brains and mental experiences, would give us both a right
to access and change our brains if we choose to
do so, and also a right from interference with
our mental privacy and our freedom of thought. Those can
simply be updates to our existing interpretations of rights.
Speaker 1 (22:13):
Already in the Universal Declaration of Human Rights we have the right
to privacy, the right to freedom of thought and belief, and
the right to self-expression without undue influence.
Speaker 3 (22:21):
You can look to the right to privacy and say
what we need to make explicit is that a right to mental
privacy is included within it.
Speaker 2 (22:27):
You can look to the.
Speaker 3 (22:28):
Right to freedom of thoughts, which has currently been really
interpreted much more narrowly to be about freedom of religion
and belief, and say, no, no, this is also about interception,
manipulation and punishment of our thoughts. And we have a
collective right to self-determination, a political right. Really,
if we look at all of the existing rights, we
can say an individual right to self determination is fundamental
(22:51):
to every right that is part of the.
Speaker 2 (22:53):
UN Declaration of Human Rights.
Speaker 3 (22:54):
So it's not that I think we need new rights,
it's that we need a new umbrella concept, which is
cognitive liberty to help us and direct us to update
our interpretation of existing rights.
Speaker 1 (23:05):
The other side of all of that, of course, is
that these are much more powerful and useful technologies for
beneficial outcomes. So how do you propose drawing a line
between the acceptable and unacceptable uses of these newer technologies.
Speaker 3 (23:21):
So I have framed cognitive liberty as a right to
and a right from. And I think the right to is
critical because the truth is, we don't treat our brain
critical because the truth is, we don't treat our brain
health and wellness nearly as seriously as we treat the
rest of our physical wellbeing. We don't have access to
information about our own biases and preferences, even our own
cognitive decline over time. Self-insight, I think, is powerful,
(23:42):
which is why it's a positive right to access information
about your brain, a positive right to be able to
use it. Notice me saying you, right? It's about you
being able to access and use the technology, you having
control over your own brain data, it not being commodified
and misused against you, and a right from other people
(24:04):
coercing you to use the technology or taking your brain data,
to analyze it, to mine it, to share it, and
to sell it. It's giving you and putting you in
the driver's seat of your own brain.
Speaker 1 (24:15):
Even if we are in the driver's seat, we do
have precedents from other types of therapeutics. I'm thinking about
attention deficit disorder therapeutics. I'm thinking about antidepressants, which emerged
over the last thirty or forty years, and particularly in
Western markets, particularly in the United States, are heavily over
prescribed because it's easier to deal with some of these
(24:39):
issues through a prescription than perhaps figuring out how to
tackle those issues. These advanced neurotechniques will be more sophisticated,
more precise, more effective. So why wouldn't they fall foul
of the same type of behavior that sees them just
being used pervasively, even if we are notionally in the
(24:59):
driver's seat.
Speaker 3 (25:00):
They could be right, I mean, I think that's the risk,
and there are risks to individuals using them to shortcut
their own process of self discovery and self awareness and
growth and emotional development. It can be misused by others.
It can be misused by parents, well intentioned though they
may be, to try to redirect the brain activity and
brain development of their children. I think in order for
(25:23):
us to effectively realize the potential of this technology in society,
we have to not just have rights right, not just
have norms, but we need to also be substantially increasing
people's awareness of how their brains can be accessed and changed.
What does individual cognitive liberty, and cultivating that in individuals,
look like? What does it mean to cultivate it in
(25:44):
children or in the workplace, or to invest in cognitive
liberty for businesses and for corporate development, for product development,
for employee wellness.
Speaker 1 (25:53):
Given the urgency, the technologies are getting better and better
at exponential rates, I'm a bit perplexed about choosing the
UN as the route to do this. It's not an
organization that in recent years people would say is adept
at dealing with change, or indeed effective when it does
(26:14):
finally get round to doing it. So is that the
right way to approach this?
Speaker 3 (26:19):
First, it's not the only way, but it's a starting place
to say: we need, worldwide, to recognize as an international
human right that we have a right to cognitive liberty,
and that simply directs us to update existing human
rights and our interpretation of them. It sets a strong
legal norm which doesn't require everybody to come together. It
requires the existing oversight bodies like the Human Rights Committee
(26:41):
that oversees the International Covenant on Civil and Political Rights
to update their interpretation and application of existing rights to
protect people. Even if we don't succeed, meaning even if
the UN takes zero action in this space, that doesn't
prevent us in the United States and Europe and other
countries from interpreting existing human rights consistently with the idea
(27:03):
that there's a right to mental privacy, that freedom of
thought is broader, that we have an individual right to
self determination.
Speaker 1 (27:09):
I mean, it strikes me that even if this isn't sufficient,
self-regulation could also be very, very valuable,
if you're able to persuade the handful, soon to be
dozens, of companies making these technologies that cognitive rights matter.
That also creates some kind of a buttress for them
(27:31):
in the near future.
Speaker 2 (27:32):
I think that's right.
Speaker 3 (27:33):
I think that could be simple things in product design,
when you have multifunctional earbuds to be able to turn
off brainwave reading while you're taking a conference call if
you choose to do so, or to recognize the mental
privacy of their employees, and to say here's technology you
can use to improve your focus, we won't collect the data,
we won't use it to make choices about you.
Speaker 1 (27:56):
Now, the premise of our conversation is that within five years,
our thoughts could be legally protected. How likely do you
think this vision could become reality?
Speaker 3 (28:05):
Well, I'm an optimist, I would say, and I believe
that we have to get this right, and that the
only way to get this right is to give people
strong rights and protections over their thoughts from being intercepted, manipulated,
and punished. Within five years, I think the technology is
going to be mature, and it's going to be widely
available at scale across society, which is why I believe
(28:28):
before that happens, we will make moves to recognize a
right to cognitive liberty, to enable us to be empowered
by rather than oppressed by the technology.
Speaker 1 (28:37):
Nita Farahany, I don't have to tell you what I'm thinking,
but it has been a tremendous pleasure to speak to you.
Speaker 2 (28:43):
Well, such a pleasure for me as well. Thank you.
Speaker 1 (28:51):
Reflecting on my conversation with Nita, I can't help feeling
that the technologies we've discussed will be mainstream even faster
than we think. Two hundred million of us wear
smart watches, and these collect personal information every second of
the day. It's becoming a normal experience, and the exponential
improvement in technologies will allow devices like these to monitor
(29:13):
us more closely, yes, to the point of tracking our
mental states. So it does make sense to protect our
minds, our thoughts. But is going via the slow-moving
UN the best route, or can we together find a
better way? Thanks for listening to the Exponentially podcast. If
(29:37):
you enjoy the show, please leave a review or rating.
It really does help others find us. The Exponentially podcast
is presented by me, Azeem Azhar. The sound designer
is Will Horrocks. The research was led by Chloe Ippah,
and music composed by Emily Green and John Zarcone. The
show is produced by Frederick Cassella, Maria Garrilov, and me,
(29:57):
Azeem Azhar. Special thanks to Sage Bauman, Grocott, and
Magnus Henrickson. The executive producers are Andrew Barden, Adam Kamiski,
and Kyle Kramer. David Ravella is the managing editor. Exponentially
was created by Frederick Cassella and is an E to the
Pi I Plus One Limited production, in association with Bloomberg LLC.