Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
I'm Dr. Julie Pham, founder of CuriosityBased.
We help people practice curiosity in the world, starting in the workplace, because that is where we spend most of our waking hours.
Curiosity as a practice boils down to self-awareness, relationship building, and clear communication.
So join us as we interview leaders to see how they use curiosity at work.
Welcome to the Curiosity at Work podcast.
(00:27):
Today, I get to interview Dr. Andrew Robertson, who is the Vice President of Global Regulatory Policy and Innovation at Takeda Pharmaceutical, which is one of the largest multinational pharmaceutical companies in the world.
And I'm especially excited for this interview because I'm going to, Andy, I'm going to call you Andy.
(00:47):
We have known each other since 2001, when we met at Cambridge as the inaugural Gates Cambridge Scholars.
And it's so much fun to get to interview you today, over 20 years later.
Thanks for inviting me, Julie.
Yeah, it's good to see you again.
So Andy, could you just start by describing Takeda and what you do there?
(01:10):
How many people, how many offices, the functions?
So Takeda, it's a multinational pharma company.
We're Japanese, over 250 years old.
We have a number of products on the market, ranging from GI to neurological diseases to oncology, and even vaccines and plasma-derived therapies.
(01:35):
We have about 50,000 people.
And I'll say this, this is my third pharma company.
And the culture at Takeda is just absolutely remarkable.
We really try to do the right thing.
I know there's a lot of distrust in the pharmaceutical industry.
We take to heart that patients come first, and actually in that order: patient, trust, reputation, business.
(01:58):
And if you were to talk to any Takeda employees, they'll tell you the same thing.
So the company has been fantastic.
I've been here for about, just going on four years.
And my role, my role is around basically R&D policy.
So when we are building our products, when we're actually developing our drugs, we have to look at not only what does the science say, and the science is critical.
(02:22):
We also have to look at the external environment.
So what is needed from an FDA perspective or an EMA perspective?
What makes good sense from a research perspective in terms of funding and innovation?
How can we actually use these and put these in place, these external developments in place so that we can be faster and more efficient at what we do?
(02:42):
So with that, it was a new group that was started when I came on board.
We went from two people, myself and one other person.
Now we're at about roughly 30.
The makeup is between scientists and attorneys.
We have some nurses, we have one medical doctor.
And the split is, I think, just more than half are full-time employees.
(03:08):
But we have a number of consultants and contractors that we consider full-time staff.
It's a great team.
It's a growing team.
I'm really proud of it.
I would put us at one of the best in the industry.
But yeah, that's what we do.
And so is your team remotely distributed then?
Because it's global regulatory.
So does that mean they have to know the policies of all the countries that you have markets in?
(03:33):
And it's remarkable because we've got, so yes, we've got teams in Japan, China, Singapore, Europe, and of course in the US.
And the really interesting thing about this is you get to see drug development, pharmaceutical drug development from a global perspective.
And normally, like when we did this before, there were a lot of silos in how the US thinks about this and how Europe thinks about this.
(03:58):
And while there might be some crosstalk, we never really saw a full integrated comparison between the two policy groups and the two regulators.
What we're allowed to do now though is when, for example, Japan is starting to modernize their pharmaceutical policies, we can directly draw lessons from the US and Europe as to what works and what hasn't.
(04:20):
And so this has kind of given us an edge.
The team is great.
We stay in touch quite a lot.
We have a number of weekly meetings, but the communication cycles are a lot more fluid than at some of the other companies that I've been at, where it was periodic check-ins; we're in constant touch with each other.
So of that team of 30, how many different countries are they located in?
(04:42):
Doing a quick count, about eight or nine.
Wow.
That must be an interesting challenge for planning meetings for those time zone differences too.
Yeah.
Time zones are tough.
We do a little bit of, you know, I stay up late sometimes, they stay up late sometimes.
(05:03):
So we try to make it even.
When I first started, I was so intent on not putting anybody out that I would say, look, don't worry about it.
I'll stay up till one in the morning or two in the morning.
I can't do that anymore.
It's just, I was just up way too late.
My kids were getting upset with me.
So we just decided to do the alternating strategy and it's worked out pretty well.
(05:27):
And Andy, so I know that you have been at the company for four years.
Is this also a new role for you as in you didn't do this kind of work in your previous pharma experience?
No, actually.
So this is my third company doing this.
So I started off in another company, Merck, doing this really at like the director level with a couple of key therapeutic areas.
(05:52):
Then I was brought over to a French company, Sanofi, where I took over the U.S. function.
And then when my boss joined at Takeda, she said, we really need to have a strategic, integrated policy function.
And she said, Andy, what can you do with this to build it up?
(06:13):
And so it was great because she saw the value.
She and I had worked on policy and integrating it into product development strategy in the past.
And she saw the value of it and she said, look, this needs to be a cornerstone of what we're doing.
We can't just think of the external environment as an afterthought.
(06:33):
And yeah, we've been able to show what I call the but-fors: but for the intervention, would this outcome have happened?
And there were several where, you know, we've been able to elevate the voice of the patient and why a certain drug is important or really revisit a data set to show why certain biomarkers are more indicative of a product's potential for benefiting a patient than maybe like the normal clinical outcomes.
(07:04):
So we've been able to do a lot of this.
And the ultimate result is getting drugs that we think we really believe in, getting them to the patient much faster and improving access.
So I've been able to build this up.
And like I said, I'll say this again, I think our team is one of the better ones in the industry right now.
So Andy, you've gotten to experience this role at an American company, a French company, now at a Japanese company where you were part of an established team and now here you built out the team.
(07:36):
And also policy and regulation is always changing.
And so I imagine that is one of the challenging things in the pharmaceutical industry where the policy is changing and you're also dealing with it in different markets.
So for my first question for you, first immediate question is we think of curiosity as a practice and sometimes we practice it and sometimes we don't.
(07:57):
And sometimes it can be really, really hard to practice curiosity.
Can you share an example of when it's difficult for you to practice curiosity at work?
Yeah, it's when the bandwidth just gets so constrained and you just don't have time.
There are instances where you just get flooded with information, flooded with change.
(08:20):
And the steps that you really need to take to kind of understand what the impact is, it takes some freedom of thought and you need to actually ask the what if questions or look for the unintended consequences or draw lessons from other industries or other specialty areas and see if they might apply to your particular situations.
(08:44):
The challenge you have though is that's kind of in many instances a luxury.
And in many cases, if bandwidth is constrained, you can't do this curiosity component.
But what I've tried to do for our team is really carve out about 10%, 15% of their time should be dedicated to what I would almost call innovative improvements.
(09:05):
So looking at areas where, just because we've done something this way for the past five or 10 years, does that mean that we need to keep on doing it this way now or in the future?
And what it's allowed us to do is to really explore new areas.
So when we're talking about policy, there's a reactive policy where the FDA might say we are going to do something and then we have to figure out how to adjust.
(09:28):
The other side of it is almost what amounts to advocacy.
So looking at areas where we think that the way that, for example, the way that we actually give information to the FDA is too slow.
And why aren't we using modern technology to actually put this up in a cloud so they can actually draw down information much faster, not only in the US, but globally.
This is like a what if where we had to really give ourselves the time and the space to think about this.
(09:54):
And not only think about what would it look like, but think about how do we actually get the ball moving, bringing on the stakeholders to get the traction, to get the visibility so that this becomes a priority for not only for us, but for the FDA, for EMA, for other groups.
And so we've done this.
We've done this in cloud-based technologies.
We're now looking at this for looking at new therapeutic models like AI, so we can maybe reduce our reliance on animal testing, see if we actually bring that down a little bit.
(10:25):
We're looking at other areas like, can the patients themselves talk about, in more qualitative terms instead of quantitative terms, what the benefit of a drug might mean to them?
There are instances, going back decades, where with things like irritable bowel disease, you might have regulatory agencies say, well, that's not really that serious of a disease.
(10:49):
But if you talk to somebody with it, they're limited in their relationships, they're limited in their job, their ability to take on jobs, limited in their mobility.
It is a life-changing affliction that we should actually take much more seriously and talk to them about what it means to have a meaningful therapy.
(11:09):
So that's a long-winded answer to say we're trying to give ourselves space to think about these new things and really think about not only the way things are now, but how could they be in the future.
Andy, I want to ask you a follow-up question because in your work, policy feels really slow.
That moves slowly.
And you said something about you can be reactionary, and then there's that being proactive, that advocacy part, actually helping inform the policy.
(11:35):
So on your team, how do you divide that up?
Are there some people who are more reacting to it and other people who are more pushing for new policy?
How does that work?
It's circular.
They're both.
We divide them up, but they're really integrated.
We have this term policy for impact, which is basically when we're looking at policy, it's not an academic exercise.
(11:58):
We really need to bring it back to our internal operations, our internal products.
What does it mean for the company?
And again, because the company puts patients first, that kind of meshes with everything, like my own personal beliefs.
So instead of the divide, we really look at is near policy and long policy.
So near policy being things like, okay, given the timelines of our development and the advocacy levers that we build, the foundations that we've built over years, what are the avenues we could take to look at new endpoints or look at patient populations and maybe build up that near-term impact on our programs?
(12:34):
But then there's the longer-term stuff, the stuff where we want to see a change in five to 10 years, but we need to start taking action today.
We look at both of those.
I'll be honest, the priority quite often comes back to the near-term policy stuff because that's where the urgency is.
We have to deal with near policy primarily because that's what's going to impact our programs.
(12:56):
That's where the urgency is, and what's going to help us best serve our patients is making sure that our programs are moving forward.
But it ties very closely with long policy.
So when we're actually doing the near policy, we're looking at things like, okay, we're trying to work with the agency, but wow, if there was a way we could actually get information to them faster, that would be a big fix for this issue.
(13:23):
Same thing with biomarker validation, which actually helps us determine quickly if our programs are working or not.
A lot of these issues come up, and so we say, well, we've seen this be an issue enough times, let's think about a longer-term solution.
Otherwise, we're going to be back here another year or two dealing with the exact same problems.
So we're very aware of the scenario today and where it could be in five years from now.
(13:48):
It is a very dynamic area.
There's new innovations coming out all the time.
A lot of noise.
We're looking for the signal.
We're looking for the things that we think could have impact.
What I'm hearing you say is that with near-term versus long-term, the near-term creates this sense of urgency, because you were saying it's limited bandwidth, but also it's limited bandwidth to focus on the things that need to be focused on, trying to figure out that priority, especially when those near-term issues typically have to do with your patients.
(14:25):
Yeah, absolutely.
Yeah.
That's right.
So it's a little more urgent too, because these are lives.
Yes, that's absolutely right.
And there is a, I mean, to be honest, it's like where the excitement is.
I mean, I think one of the risks with policy is that it's so easy to get detached from what you're actually doing.
But if we're actually looking at a lot of these issues and they directly go into how we're setting up our studies or how we're actually planning to engage with the regulators, that's an impact.
(14:53):
The other side of this too, Julie, is like there's, okay, this is a large company.
There's a credibility value, which is also assigned to this.
And so if we can help on the near-term, it gives us more freedom to look at the long-term.
And that's something that, I think, when I look at my past companies, we kind of overlooked.
(15:15):
We thought, okay, well, let's read papers.
Just think about the world in five years from now.
But if you can't show the value, you're probably just talking in the abstract.
I had a great advisor when I was at Merck.
She said, look, one day you might be running your own group.
You have to have the fear of God in you that the CFO is going to knock on your door and say, why are we paying you all?
(15:38):
And I've taken that to heart.
If I can't actually tell them why we're here, what the value is, again, the but-fors, then forget long policy.
I won't be able to do the short policy.
I won't be able to do the today stuff.
That makes me think about how you have to get enough traction to build trust and just, oh, this is worth doing.
And then, Andy, what you were also saying about finding innovation, finding innovative ways to get that information to the FDA faster, more easily with those constraints in place.
(16:10):
I hadn't thought about that in terms of an area where you can practice curiosity to actually expedite that long term.
I mean, it gets me really excited because there is a vision here, right?
So right now, the traditional way for doing this is there's an electronic portal.
We take all of our clinical studies, all of our data.
(16:30):
It's a very strict format.
We put it through the portal, and then the FDA will look at it.
Then we do the same thing for Europe.
We do the same thing for all the other regions, and we do one by one.
The communication lines get very muddled very quickly.
By the way, this is an improvement because back in the day, there used to be a full-on 18-wheeler truck that would pull up to the FDA and just put the...
(16:51):
So we are making progress.
But now look, think about it this way.
Instead of doing this one by one, we do this in a cloud-based format, and then the regulators actually draw down the information that they need when they need it and do the reviews when they need it.
All of a sudden, you cut out a step.
But we're even thinking beyond that.
I mean, there's a...
If you have a data-centric approach here, you can actually get into a situation.
(17:15):
I mean, this is very hypothetical, and there's actually even some drawbacks, but you can get into a situation where you just put the data up in the cloud, and then you actually have the programs for doing the statistical reviews there.
And then one day, a company just gets a letter: congratulations, your 578th patient enrolled and was successfully treated by your product.
(17:36):
We now consider this an approved program.
Go forth and market.
There's some consequences to that.
I don't want to say it's the solution for everything, but you can think about how streamlined the continuity, the constant data stream could be for, again, accelerating these programs and really understanding how they're working and getting them to the patients faster.
(17:59):
I'm going to keep on coming back.
That point is really important.
So, integrating technology as an expression of applying curiosity to the workplace.
I want to talk about the people side, because you manage this very diverse team just in so many different ways.
I mean, functionally, culturally; you've got lawyers, you've got scientists.
(18:25):
So can you tell me, how do you practice curiosity?
What's something that you do to practice curiosity on the team?
Because I actually think there's probably so much.
How do you actually figure out what you're going to focus on?
We orient, again, we orient to the programs, the products, what is the impact that we would have?
(18:50):
We do have reputation as a metric.
So reputational boost could be something legitimate to kind of justify why we would go after a certain initiative.
The programs though, I think are also very tangible, very concrete.
Like if we can get this better.
And by the way, I'm not just talking about like the programs in terms of the trial design.
(19:14):
There's a whole operational aspect to what we're doing, right?
Like, if you think about how long it takes for us to go from what we call database lock, locking the clinical trial, to getting an application into the agency, it takes roughly 20 to 30 weeks to do that.
I think you got to ask the question, is there a faster way to do this?
(19:35):
Like, if you can actually divide that by half, you're saving close to four months, and think about all the impact you can have in that four months for the patients, right?
And so, okay, use that as a goal.
And then the way we typically do this is instead of saying, oh, here's a technology, can we apply it?
(19:56):
We flip it.
Where do we want to be?
And then let's say, okay, we got there.
Now what do we have to do to get there?
So if I say, okay, the goal is going to be a 10 week from database lock to submission, but what does that solution look like?
Then we can backtrack and say, well, we need to have automated content generation because our writers are just taking way too long, or we need to have cloud-based submissions because the actual work of going from the FDA to the EMA to all these groups is just way too uncoordinated.
(20:24):
If we backtrack this and start with a solution and think about how we actually got there, then it almost gives us a roadmap for what we need to do.
So we've done this in a couple of instances with external partners.
We want to have this organization be a primary partner.
What do we have to actually do?
We have to boost our reputation, get our SMEs in there, show that we have a technological advantage over our counterparts, putting these things forward.
(20:53):
And then that sets up our plan for doing that.
So we've done it on external partners.
We've done it on trials.
And it's worked.
It's worked.
The challenge that we've got is you have to kind of go against industry norms and the tried and true processes.
(21:16):
So there's going to be a lot of instances where folks would say, well, what if it doesn't work?
There is going to be a risk associated with doing something new.
This is a strategic question we ask for a lot of our programs.
Do we want to be the innovators or do we want to be the fast followers?
There's no right answer on that.
(21:37):
I think being a fast follower can almost sometimes be more exciting because you're looking for that path, and you might even do it better than the innovator itself.
I'm not talking only about drugs, but like processes.
(21:58):
So, Gen AI is a big buzzword right now.
Regulatory uses a lot of documentation that takes a lot of hours to create.
There's also a compliance issue around it as well.
What do we want to do?
Do we want to be the ones that actually work with these Gen AI companies to come up with a platform that can create this content for us?
(22:22):
Or do we want to watch and see if another company does it first, and if we really like that other company's approach, then we go in and license in that solution?
There's advantages to both.
One is you get to shape something which really fits your processes, but it might be more risky.
The other is you can actually see how it works in the real world, but we are by definition going to be behind.
(22:46):
We do that for that.
We do it for trial design.
I don't want to go on too many topics.
It might confuse your audience, but for trial design, for example, typically we go through academic research centers.
There's been this whole concept and notion about decentralizing the studies.
So you can go out to community-based clinical study centers or even doing part of the study in your home, using digital tools to actually help gather the information from the patients about if the drug is helping their mobility or helping their sleep scores.
(23:22):
A lot of these are innovative approaches.
If we are the first ones to do it, that is great, but it also carries a risk to it.
So that's the balance.
I think that's the balance we need to address.
What I'm hearing you say is there's not a hard and fast rule each time you have to make that assessment.
So in this, I'm hearing you say you've got to think about where you want to be, that end in mind, and then open it up for that discussion.
(23:49):
How do you, with such a diverse team, foster that learning from one another?
Because I'm assuming people have assumptions.
They're just, this is the way it works in these countries that I'm used to, or this is the way it works in my function, and sometimes we take things for granted, what we know for granted, and we assume other people also know as much.
(24:10):
So how do you break that down?
You know, honestly, I think the only point I could take credit for is the hiring, but it's the people.
We hire people that are really driven.
One of the beautiful things about having this multidisciplinary team of a lot of different specialties is that we look at things from different angles.
(24:36):
I do my best to make sure that we are very communicative and we collaborate as much as we can, but they are the ones who are saying, hey, I've noticed we've been doing it this way.
If we just tweak this one thing, we could save X million dollars a year.
That's awesome.
I think the one thing that maybe I do is I hire in the folks, and then I kind of give them very broad, but very clear targets.
(25:02):
So doing things faster, doing things cheaper, doing things with better quality.
Those are the main key ones, and it comes back to that CFO discussion.
They say, well, why are you here again?
It's because, well, we added this value to the pipeline, or we were able to get these goals, but with a third of the resources.
(25:23):
Those are all things that we can actually contribute by keeping watch of the external environment, looking for those opportunities, and seeing how we can integrate them with our programs and our product teams.
That's the only credit I'll take.
The team does it themselves.
They are always challenging the status quo, and they're coming to me with ideas all the time, and it's fantastic.
(25:44):
So, Andy, this gets us to the question around recruiting and hiring.
So you grew this team from two people to 30 people.
What's a favorite question or something you look for in trying to ensure that person is going to be able to practice curiosity on your team?
My interview, I probably don't do the best interviews.
I think my interviews are really conversational.
(26:07):
I want to see if that person would be a good thought partner.
So I'll talk about the things that are bugging me for that day, or that week, or that month.
And, of course, you've got to ask the questions about their background and how they would address stuff.
But for the most part, here's a problem I have, and see what their response is to it.
(26:34):
One thing I've noticed is I really look for the person that they could become instead of the person that they are.
Does that make sense?
Yeah.
Yeah.
And that could be based everywhere from their reputation in the industry to just the way they address a problem.
(26:57):
A couple of my absolute rock stars right now didn't give good interviews.
Some of the interviews they gave were maybe very reserved, or I kind of didn't understand the point they were getting at until a little bit later.
But those are communication issues which I think could be taught in most cases.
(27:19):
The underlying thought process they had, though, really said, wow, you're getting into an area where they get it.
We're aligned in value.
We're aligned in saying that this needs to come back to having some meaningful impact for the company.
So when I interview folks, I'm trying to think, you know, how would my group be different with them?
(27:43):
We don't have narrowly defined roles where people just do a job.
Everybody's got to be creative, and I put them in the positions to do that.
And then my job is just to give them as much resource and as much visibility to get the job done and kind of get out of their way.
I mean, they're just doing a great job.
(28:03):
And I'm very proud of my team.
I think it's great.
So I appreciate what you're saying about how it's not just about what they do in the interview.
It also sounds like you're looking at what they did before because you're looking for that potential.
Yeah.
Yeah, absolutely.
And I think that that really speaks a lot.
I mean, we've had, not for my team, but for other senior leadership roles, we've had folks come in and just ace the interview.
(28:24):
But then we do the background.
I mean, not the normal background checks, but we ask, well, OK, what's their reputation in the industry?
And, you know, it turns out very differently than their interview led us to believe.
And the opposite is true as well.
We've had folks that come through where people are like, well, I'm not really sure that's the right person or fit in the culture, but their reputation has been absolutely, you know, game changer status.
(28:47):
And we bring them in and they just absolutely crush it.
I think interviews, honestly, are such an imperfect mechanism.
You know, that's why I don't like going through these lists of questions.
My point is, can I have a meaningful conversation with this person where I can come out of it with feeling like I've learned something or feeling like I have a new perspective that I didn't have before?
(29:11):
My dad was an executive in Silicon Valley.
He had the same thing: there was somebody on his team who would always agree with him and just say, yes, sir.
Yes, Dr. Robertson, that's right.
And my dad in the end had to sit him down and say, look, if you're always agreeing with me, you're not adding anything.
(29:32):
And that means one of us isn't needed.
And I'm not going anywhere.
And that's kind of it. I mean, I'm not as harsh about it, but I want them to feel safe and confident and secure, but they need to bring in their own ideas.
And that's what I look for in the interviews.
So I think the universal lesson here is don't just focus on the interview, actually look around the interview, talk to other people.
(29:54):
So, Andy, last question.
What is inspiring curiosity for you in your own world right now?
My kids are actually driving a lot of this.
It's really interesting to see them look at a problem or a situation with fresh eyes.
(30:14):
And I used to do this a lot.
I think everybody used to do this when they were a kid.
They would have that imagination, and it kind of wears off a little bit as we get older.
I think they sparked it again.
I mean, it's a cliche response, but I think they sparked it again because you see the excitement in them.
And, you know, it's something that sometimes comes naturally, but I think a lot of times I have to really force myself to get excited.
(30:41):
The stuff that I get to work on, the team that I get to work with, it's for something which I really believe in.
It's taking from science, it's looking at policy and how structures and systems are built.
It's really exciting.
And so when my kids get excited about a swim meet, how it's coordinated, or a new book that they've read, I try to internalize that level of excitement when I actually go back to work.
(31:13):
I'm sure I annoy a lot of people at work, but I'm okay with it because I got to be honest.
It's like, I'd rather go to work excited every day than feel some level of dread about this.
So.
That's beautiful to know that just having a conversation with your kids can spark that curiosity again.
So Andy, how do people get in touch with you?
(31:33):
LinkedIn is best.
I don't usually post as much as I would like to, but it's the one thing that I will circle back to, not right away, but, you know, once every few days.
And I do my best, especially with people that are looking for advice, I will do my best to reach out, give time, set something up.
(31:55):
And yeah, I mean, the one thing I'll say is I don't always have the best advice, but I think that's true of anybody who's giving any sort of mentorship; you should always keep that in mind.
But I'm happy to give my perspective.
Thank you so much, Andy, Dr. Robertson.
It's been such a pleasure to chat with you, and you've demystified so much of pharmaceuticals and policy and all that regulation for me and for our audience.
(32:24):
So with that, thank you.
And listeners, please practice curiosity in the world.
Thanks.