
July 3, 2025 60 mins

Many a business was launched in carbon credits trying to fix a conflict of interest problem. Has Absolute Climate cracked the code? Should registries get out of the methodology development business?

Peter Minor, CEO and Co-Founder of Absolute Climate, is on the show today to talk about the many issues of trying to create an ultimate standard in carbon removal (hence the amazing xkcd meme), and how he thinks the current system is set up to fail.

Are we doomed to always face conflicts of interest? Do we inevitably end up thinking not in terms of ultimate design victory, but balance of power? Is it all just "a dogfight in the Wild West?" Can we all agree upon the same standard?

Listen in as Peter and Ross think the big thoughts and try to white-hat hack the system to advance CDR.

This Episode's Sponsors

Arbonics

CDRjobs

Listen to the RCC episode I did about CDRjobs' Salary Survey and why carbon removers should fill it out

Fill out the 2025 CDRjobs Salary Survey HERE

Listen to the RCC episode with Lisett Luik from Arbonics

Become a sponsor by emailing carbon.removal.strategies[at]gmail.com

Use this affiliate link to use Descript's transcription and podcast editing service

Use this affiliate link to use Riverside to record your podcasts

Sign up for the 9Zero climate coworking space with my referral code

Resources

Become a paid subscriber of Reversing Climate Change

Absolute Climate

"Never complain, never explain"

Streisand effect

The Civil Law Tradition by John Henry Merryman and Rogelio Pérez-Perdomo

"Common law" article on Wikipedia

The Wide Lens by Ron Adner

The Spotify image is the absolutely legendary and final word on the quest for the One True Standard™, courtesy of xkcd under Creative Commons. No infringement intended, just a celebration of the genius of their brevity.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
You have found yourself at the Reversing Climate Change
podcast. I'm Ross Kenyon, I'm the host.
Before we get started today, I want to tell you about the
sponsors of this episode. I'm so grateful we have a new
one, which is cool: CDRjobs. Surely, if you work in
carbon removal or aspire to, you have been on the CDRjobs
website. They are the place to find all
of the CDR jobs. No adulteration, no other

(00:24):
adjacency. It's carbon removal.
So that's the place to look. One of the things that is very
cool about being a CDR job posting site is that they get a
lot of data on employment in CDR, and they have not slept on their
data duties. In fact, last year they did a CDR
salary report, which details how much people make in the space,

(00:48):
whether there are any discrepancies across gender or race, which
countries, where jobs are being created in carbon removal.
They gathered over 800 data points: 400 individual
responses and 400 salaries from job openings, which represents
only a partial sample size. But even still, that report was
downloaded a four-figure number of times, which is a lot of times to

(01:08):
download a report. People want this information and
they're doing another survey this year.
So recently I put out a small episode about why I think this
kind of work is important, about HR decisions, about pay
transparency. Why I think that's on net a good
strategy if you are an employer and also if you're an employee,
why you should talk to your colleagues about salary and

(01:31):
destigmatize conversations around that topic and why they
can be really useful. And I think the CDRjobs 2025
salary survey is a really powerful way to anchor that
conversation, to give you something to talk about, because
it makes sure that employers are creating a trustful environment.
And it's also making sure that employees are not being taken

(01:52):
advantage of in any way. I don't suspect that is
happening. CDR is a small space that is
very mission driven, but it is beneficial for us to work
together to make sure that, you know, our peers are taken care
of or we ourselves are taken care of.
And if you are in a position of power, that you are creating a
kind of environment where people really want to love the company
that they're at. And this is potentially one way

(02:12):
that you can do that. So I would say if you are
already working inside of carbon removal, please go fill out the
2025 salary survey. The link is in the show notes.
And also, if you're looking for a job in carbon removal: CDRjobs
dot earth, that's where they are.
Go look through them, go apply for them.
If you're at a company and you want to sponsor their work,
that's also a possibility where you can get better visibility

(02:34):
for your jobs. So thank you.
New sponsor means a lot. Thank you, CDRjobs team.
I appreciate your work and hope it continues for a long time.
We also have our beloved longtime sponsor; Arbonics is
back again. Arbonics does fascinating forestry
work in the EU, primarily in the Baltic States, and they just
released a new report on the state of European forest carbon
credits in 2025. It's a practical guide to how

(02:57):
forest credits are generated, verified, and how developers
handle permanence, leakage, and social integrity in a European
context. What's cool about it is that it
also breaks down how methodologies differ and how
pricing differs between these methodologies, and why the timing
matters in a supply-constrained market.
Unless you are a deep, very special type of nerd, this
content is hard to parse, but I think the work that Arbonics is

(03:20):
doing to try to make this easily graspable by busy people is
really important. And you know, there's
really just not that many European forest carbon credits.
With demand rising and afforestation projects taking years to
mature, many high quality credits are sold out before
issuance, which I'm sure you've seen or at least heard
about. This report is a timely overview

(03:41):
of bottlenecks in the space and the actors who are looking to
solve it. You can find this report in the
show notes. In any case, thank you, CDRjobs
and Arbonics; your sponsorship means so much to me,
and now we will allow the show to begin in earnest.

(04:02):
Hey, thanks for listening to the Reversing Climate Change
podcast. This is Ross Kenyon speaking.
I am the host of the Reversing Climate Change podcast.
Been involved in climate tech and carbon removal for nearly a
decade now. No intention of leaving despite
the winds of change a blowing. Today's show is complex.

(04:24):
Lot of big questions about conflict of interest.
We get into the differences between the common law and civil
law traditions and which route carbon removal should take as we
move ahead. My guest today is Peter Minor,
who is the Co-founder and CEO of Absolute Climate.
Absolute Climate made a fascinating design decision
where most people think that registries produce

(04:47):
methodologies. A registry in carbon markets is
where the carbon credits actually live, where the status
of those credits is tracked.
registries often also have scientific teams that develop
methodologies through which carbon credits can be issued and
then be on that registry. Those are functions that have

(05:07):
historically been combined. That isn't always the case.
There are often consultancies that will work with project or tech
developers to develop new methodologies to serve them and
then look for them to have a home on a registry.
That happens too. But overall, methodologies are
developed by registries, and that's true both in legacy
carbon markets and also new ones that focus on carbon removal.

(05:30):
The contrarian take here is: what if registries actually didn't do
methodology development because they're always trying to attract
project developers to work with them.
And registries therefore will face downward pressure to reduce
quality to attract more project developers who don't necessarily

(05:51):
want the extra work or expense or both of developing higher
quality credits. And so Absolute Climate's thesis
is that if methodology development could be done by an
entity that is not the registry, this potentially creates a
better system of checks and balances to make sure that we
are scientifically on track to develop meaningful carbon assets

(06:15):
independently of whether this works or not.
It's the kind of thing that I like.
I like platform and format innovation.
I think being able to relax restrictions or add new ones and
having enough room to operate in this space is very valuable.
I'm not sure that we have found the ideal configuration of all

(06:38):
of the parties within carbon removal and carbon markets.
I think we need more space to explore those ideas openly and
make sure that we're doing it in a way that gets us where
we're going, increases trust, makes it easier to develop high
quality carbon assets, and makes sure that what we're doing is
actually meaningful. I'll put some links in the show

(07:00):
notes if you'd like to learn more about the civil law and
common law traditions if you want to read about the
Napoleonic Code.
Well, you're listening to Reversing Climate
Change, so presumably you're at least half interested in some of
the weird crap that comes up on this show.
So I'll put a little link if you want to,
if you want a tiny taste, to dive further into legal history. And
if you love things like that, if you can't get it anywhere else
and that's why you're here, a great rating and review on Apple

(07:22):
Podcasts or Spotify would be amazingly helpful.
Also, it's $5 a month through Spotify, and you actually don't
need to be on Spotify to get it, but $5 a month gets you ad-free
listening, except for the sponsorships by
paid sponsors that I read myself at the beginning of the show.
You can get away from the programmatic ads that Spotify
puts into the show. Thank you for listening.

(07:43):
You have a lot of choices out there.
The fact that you're spending time with me honors me
immensely. I'm so happy that you're here.
Thanks for listening, and here is your show.
Thanks for being here, Peter. My pleasure.
Always great to see you, Ross. Yeah, it's good.

(08:03):
Is there some what? Is there some clicking noise?
Oh, does that mean? Is it this thing?
Oh, is it like a fidget spinner friend?
Yes. It's literally a fidget spinner
thing. I just I'm a fidgeter and so I
like play with this, but I will not do that for the extent of
the podcast. I didn't hear it.
That's, that's good.
Good audio, that's impressive. Oh yeah, that's OK. That's fun.

(08:24):
Well, you're, you're a fidgeter, which means you want to do the
most detail oriented, laborious task possible in your life.
So you're writing carbon removalstandards?
That sounds about right. Yeah, Yeah.
I mean, I think carbon removal standards are going to be the
rules under which we decide whether carbon removal has the

(08:44):
impacts that it wants or it doesn't.
And so I think, yeah, but my team tends to be very detail
oriented and a little bit OCD, which kind of makes us good at
what we do. But yeah, I mean, I feel like
the mission of carbon removal is so much about impact, right?
And so how we measure those impacts is paramount to whether
we're successful or not. And it sounds like Absolute

(09:07):
Climate started because of a concern that we've lost sight of
this or maybe commercial interests are distorting what
you view as the correct way to set standards and methodologies.
Is that a quick read of your take on this?
It's definitely something I've been concerned about.
I think it all started when I was at Carbon180.
I was the director of science and innovation there.

(09:27):
And this is like a little bit before the DAC Hubs were about to be
announced. We had heard about the program
and we were thinking through, like, OK, four facilities,
a million tons per year each. Like, that's real carbon, right?
Like this is not just an idea anymore, but we're doing it at a
scale where it's actually becoming something.
And so the question of how do we actually measure success?
How do we decide, you know, whether it's being done in an

(09:48):
accountable way? Like how do we think about how
we connect the projects to the atmospheric impacts that they're
going to have all of a sudden became very clearly important
and the sort of best practices around MRV, what it should look
like, how it should be done, hadn't really been established
yet. And so that was something that
me and my team decided to dig into and we put out some sort of
what we thought were best practices around accountable

(10:08):
carbon removal from an MRV perspective.
But I think what really struck me in getting to your point is
that, you know, we think of carbon removal as this distinct
industry compared to avoidance, or distinct from forestry, you
know, both of which have had some pretty high
profile failures. And that's true to some extent,
right? Like, CDR fixes some issues
around durability in some cases. And you know, there are others.

(10:30):
But the conflict of interest part, like the incentive
design piece, which I would argue has actually been the core
failure point of those other industries,
is no different in carbon removal. The
same failure mechanisms still exist.
And so I think the core concern that I had was how do we
build MRV in a way that is accountable not just today, but over
the long term, right? How do we make sure that at the

(10:51):
million and billion ton scale, you're still accounting for this
in a way that means something, you know, is accurately tied to
what the atmosphere is seeing. And I think the system as
designed today is not set up for that; I think it tends to
be a race to the bottom. Two big points I want to respond
with. The first one I don't know if

(11:12):
it's as important as the second: should
a system be built at its early stages to accommodate million or
billion ton throughput? Is it appropriate to design all
of that so far in advance? Maybe I'll just let you answer
that, because I want to ask about this topic, this question.

(11:33):
I think that's going to drive a lot of the show.
No, I mean, I think you're asking a really important
question. I think that's one that is part
of the challenge that we need to try to solve today, because
there's a tension between these two things that we need to be
somehow solving. I mean, I think in most
industries, what's going to happen is you're doing a bunch
of R&D, you're doing innovation. We're trying to figure out, like,
what's this great new problem that we're going to solve, this
new technology we're going to solve it with?

(11:54):
Once I kind of figure out how it works, then I build it, then I
scale it, then I make it happen, right?
In CDR, we have the unenviable position where we need to do the
science, understand how it works, and scale it at exactly the same
time, which is not a situation that I would recommend to
literally anyone else. But that is the situation that
we're in. And so there is a tension
between how do we think about building MRV and building

(12:15):
credibility for the long term, right?
How do we create a North
Star so we can make sure the
industry is going in the right direction, while simultaneously
dealing with the fact that we're not there today?
And actually, in many cases for some of these early and first of
a kind projects, it probably doesn't make sense to deploy
requirements around MRV that are needed at full

(12:36):
and time. But I think without that, what
happens is, if you allow just for the lowest common denominator,
you remove all incentive to do better, right?
There's so much work that we need to do around better
quantification methods, better sensing, better models.
And if the standards don't require that, if there isn't
value in doing better, then there's going to be very little
incentive for people to actually invest the time and resources.

(12:59):
And actually what happens is almost literally the opposite,
where if you are a project developer, your incentives are
to go with the standard that gives you the most credits,
right? Like that's, that's why you're
doing this. Like there has to be an economic
incentive for them. And so they're going to pick the
thing that gives them the most credits.
And if, you know, let's say one standard has low
requirements and gets away with it, then every other standard

(13:20):
needs to meet that because they need to stay competitive in this
market. And so you're sort of ratcheting
down piece by piece, lower and lower, to the point where it
doesn't mean anything at all anymore.
And so that's the challenge: without that North Star
clarity, and an ability to measure when something is truly
better, MRV inherently will fall apart.
It's just a question of when, not if.

(13:42):
I have some anecdotal experience of project developers choosing
registries and methodologies not based upon the number of credits
that they would earn and the revenue, but also the speed of
issuance. I've seen them decide based upon
general reputational effects of choosing one registry over
another. It hasn't purely been about

(14:04):
short term revenue based thinking, but it sounds like
you're making a very conceptual, aprioristic argument about
self-interested people choosing low effort, high reward.
The tendency is toward that way of decision making.
Is that how you see it? I'd say more just that the task we
have in front of us is incredibly hard, right?

(14:25):
I mean, like, scaling up this industry, it's the equivalent
of literally taking a few drops of ink into a pool and then trying
to remove those drops of ink again.
That is an incredibly difficult technical challenge,
especially considering we're in an industry where cost is the
driver of almost everything. And so any sort of efficiencies
you can gain any way that you can sort of simplify the
problem, make it easier to scale, that's incredibly

(14:47):
important for project developers.
Like I think it's a good thing that we are seeking what is most
efficient. I think that's important, but we
also need to realize that there's sometimes when we do
create efficiencies that are reducing the quality of the
outcome, that the atmospheric impacts are less certain or
there are more risks involved and that needs to be accounted
for also. And that's just something that

(15:07):
the voluntary market, and in some cases I think compliance markets,
are just not good at valuing. I think that we tend to want to
just treat these things all the same and say, like, hey, this is
the best buy. And so, you know, that's good
enough. Actually, no other industry in
the world operates that way. In any industry that
cares about quality, quality assurance processes are designed
for very specific performance outcomes, like they're trying to

(15:29):
measure a thing. You know, if I'm trying to say I
am going to build a new wind turbine, I'm not going to say,
oh, I only can use this one grade of steel.
I'm going to say it has to have this yield strength and this
sort of, you know, toughness, basically these
different types of performance specifications.
And I can use different grades of steel.
I can use aluminium, I can use carbon fiber.
There's lots of materials I can use as long as they meet these

(15:49):
requirements. It's based on the outcomes that
we measure success. And CDR is maybe the only
industry in the world where that is actually not the case.
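Peter's wind-turbine analogy (specify the performance outcome, not the material) can be sketched in a few lines of code. The materials, property names, and numbers below are purely illustrative, not real engineering data or anything from Absolute Climate's standard.

```python
# Performance-based spec: accept any material that meets the outcome
# requirements, instead of hard-coding one approved grade of steel.
# Property names and numbers are illustrative only, not design data.

SPEC = {"yield_strength_mpa": 250, "toughness_j": 27}  # required minimums

CANDIDATES = {
    "structural steel": {"yield_strength_mpa": 350, "toughness_j": 40},
    "aluminium alloy":  {"yield_strength_mpa": 270, "toughness_j": 30},
    "cheap casting":    {"yield_strength_mpa": 180, "toughness_j": 15},
}

def meets_spec(props):
    """A material passes if every measured property clears its minimum."""
    return all(props[key] >= minimum for key, minimum in SPEC.items())

approved = [name for name, props in CANDIDATES.items() if meets_spec(props)]
print(approved)  # the steel and the aluminium pass; the casting fails
```

The point of the analogy: the spec never names a material, so new materials qualify automatically the moment they meet the measured outcomes, which is the opposite of a methodology hard-coded to one sensor or one model.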
It's possible that we're just at the stage-one rocket phase, where
we're just trying to get off the ground, and then maybe some of
the stuff can come later. Like, do I need to have the soil
water measurements for enhanced weathering fully thought through

(16:10):
before we start commercializing enhanced weathering?
That might not be what you're arguing for, potentially.
I have seen people argue that we shouldn't be commercializing
tech like this until we really know answers to those questions.
But we don't really have a lot of time to be dealing with this
for carbon removal. We're in overshoot
territory. Like, people I know are
talking very seriously about solar radiation management
coming online like, so we don't have a lot of time to get things

(16:33):
perfect. How should we be thinking about
that? This is why it's so tricky
because that's absolutely right, you know, like we do need to be
building and scaling. We can't let the perfect be the
enemy of the good. And there are just some of these
questions that are going to just take time for us to develop the
technology to solve or the science to solve.
And we, you know, we need the time to do it while scaling.
So that's right. And so I think, but I don't

(16:54):
think that's incompatible with sort of the other side of this,
which is I think one thing we forget is we're not just scaling
projects and technologies, we're also scaling political economies
with this. Like we need to convince
lawmakers this is worth investing in.
We need to build, you know, jobs and create workforces.
And these are things that are very hard to change afterward,
that once you actually convince someone that we should

(17:15):
build this, you get the political economies going. Going
back later and saying, OK, we're going to increase the
standards, and now this whole segment of
the industry, oops, those all have to go away,
now we've got to fire all those people, and all of a sudden this
thing that made sense doesn't make sense anymore.
Those are just not things that you can walk back afterward.
And this has been the classic challenge that we've had in
climate, where we assume we can do what's directionally

(17:35):
correct and then not think about the longer term repercussions,
and think longer, you know, have a North Star that
we're really trying to shoot for.
And that's bitten us on the butt almost every single time.
There's an example I've heard; this might be apocryphal.
I'm not even sure if it's true, but the point
behind it is true even if the example's false.

(17:55):
But after the Clean Air Act passed under Nixon, there was a
specific filter or a scrubber for exhaust coming out of
manufacturing facilities. Because it was named
specifically in the legislation, or the administrative policy
that came afterwards, it just stayed there for decades
longer than it needed to, even though the tech surpassed
it. Did that actually happen?
That, that is a true story. Yeah. I mean.

(18:18):
If that is true or not true, there are plenty of examples of
that: ethanol in our gasoline. The whole intention originally
was that cars tended to run rich, which means they had more
fuel. So to do combustion you need a
mixture of air and fuel, and
there's like a perfect ratio where you get perfect
combustion. Usually cars would run a little
bit rich, which means they have more fuel than air.
And it was more of like a safety thing.
And so the idea was if you take ethanol, it has more like

(18:41):
oxygenation in it than normal gasoline does.
So you're basically, like, tricking the engine into absorbing
or, you know, having more oxygen in there.
So it would run leaner, so more efficiently.
So they're like, yeah, this is so smart, like basically going
to trick it. It was like literally a few
years later car manufacturers built and introduced oxygen
sensors into the combustion process so they could detect
this and then immediately correct for it.
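The leaning-out effect Peter describes can be put into rough numbers. The air-fuel ratios below are round textbook figures (about 14.7:1 for gasoline, about 9:1 for ethanol, by mass); real fuels vary, and linear blending by mass fraction is only an approximation.

```python
# Why blending ethanol leans out a rich-running engine: ethanol carries
# its own oxygen, so its stoichiometric air-fuel ratio (AFR) is far
# lower than gasoline's. Round textbook figures; real fuels vary.

AFR_GASOLINE = 14.7  # kg of air per kg of fuel at perfect combustion
AFR_ETHANOL = 9.0

def blend_afr(ethanol_fraction):
    """Approximate stoichiometric AFR of a blend, by mass fraction."""
    return (1.0 - ethanol_fraction) * AFR_GASOLINE + ethanol_fraction * AFR_ETHANOL

# An engine metering fuel as if it were pure gasoline effectively runs
# leaner on E10, because 10% of the "fuel" brings oxygen along with it.
print(round(blend_afr(0.10), 2))  # about 14.13, vs 14.7 for pure gasoline
```

So a mixture tuned slightly rich on gasoline lands closer to stoichiometric on E10, which is the "tricking the engine" effect, until oxygen sensors made the trick redundant.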

(19:04):
So why do we still have ethanol in our gasoline?
Political economy. Political economies, political
capture. We decided this is something we want to support, and
it's been impossible to change ever since. This has been my long
standing anxiety about policy, because I've slept on policy for
a long time, and Nori was not a very policy-forward company.
We were very much "the VCM is the way to go."

(19:25):
And part of it was knowing that no matter how policy gets
written, these political economies will be established.
There's vested interest. They're very hard to change once
people are making money off of something.
And we are not an industry that can lobby successfully against
agribusiness or oil and gas or other people that have a lot of
influence. So I've always been very
cautious about how much policy we want for that reason.

(19:47):
It's not like you're saying there's no way out of it
basically. But should we be setting those
up for carbon removal now? Like, do you want that kind of
power in your own hands? That's a lot of responsibility.
It's kind of scary, saying, oh wow.
I mean, yes, I think so. I mean, I think the reason why
this matters is because I think there is a better way to set
standards than we do today. Like today, you sort of are
choosing between either maximum flexibility or, I can have, like,

(20:09):
some say, more rigor, but then it usually gets hard coded to
specific approaches: like, if you use this sensor, if you use this
modeling approach, or you have to
gather this kind of data. And I actually think there's a
way that you can kind of bridge the gap between those two things
where, you know, our standard, the way it applies is we have
one single assessment approach for all types of CDR that we do
the exact same way for everybody.
And one of the big advantages there is that we can set what

(20:31):
that sort of quality requirements are, in terms of
like what data you capture, what baselines need to be, what best
practices are. Even the definition of storage is not
compatible across pretty much any other standards.
We have a consistent way of approaching that.
What's nice is you can say, like, OK, you need to meet this
quality bar, and this is how you actually measure that
specifically with data. But now, how you do that, what

(20:53):
models do you use, what sensors you use?
That's up to you. You can now decide on a project
scale, like what works best for you.
And we're just going to measure you against how you meet our
criteria. And so there's a way that we can
build better policy that historically has been sort of
hard coded to types of CDR or types of approaches, which tend
to be very brittle. And we can design it based on
performance, because that's the thing we care about, right?

(21:14):
What we care about is how many carbon dioxide molecules are in
the atmosphere and how many of them are we actually moving and
putting into durable storage for, you know, a long enough time that
it actually has an impact on climate.
Like that's the only thing that actually matters.
And so if we put the requirements on that, then you
don't have to have that same problem that we've seen in the
past. Do you feel pressure to
get to 100% perfection in what you're doing? No.

(21:38):
No, I think. You might be distorting it if
you don't get there, though, in a way that is likely
to persist, because you'll make people very angry if they have
to change, I think. I think, I think there's.
So are you talking specifically about how do we set the right
standards for the future long term or just how are we
assessing it today and making sure that it is accurate in

(22:00):
terms of what impacts it has on climate?
I think it relates to both of those questions, essentially,
basically. The
example I come back to often is I would not want to be at SBTi.
Whatever you do is going to make someone really mad.
Depending on how you determine the LCA project boundaries in a

(22:20):
general way is going to, you know, if you're really strict
about it, some companies are going to love you because you're
going to harm their rivals and others are going to be really
upset because you harmed their business.
Like how, how are you able to? I don't know. It's a tough position
to be in, but I wouldn't want that seat personally, is what I'm
trying to say. I mean, I think that's actually
what we're trying to fix here, which is not just like,

(22:42):
like right now, what the right boundary is or what is the
right, you know, level of quality.
That's basically just someone's opinion, right?
Like someone is deciding that some hopefully very smart
scientist is saying like, this is what we think is the best
practice, or here's what we think is rational or makes sense.
But then a completely different scientist does that for a
different project, even in the same pathway or certainly does
it differently for a different pathway.
And so we just have basically a mix of different opinions and we

(23:04):
can spend all day long arguing who's right, who's wrong, and
you're right, Like we're never going to make progress there.
Like it ultimately just becomes a series of political
arguments. And I do think
that we're never going to be able to get away from all of
those. Like some things are going to be
completely subjective and that, you know, we'll have to deal
with that in the political ways. But I think there are ways that
we can be not just more objective, but maybe more

(23:27):
importantly more consistent where if we're at least applying
the same rules for everybody, maybe the rules are right, maybe
they're wrong. Like maybe you could argue that,
but if they're at least consistent for everybody, like
we're measuring quality and we're measuring
success exactly the same way for everybody, then at least you can
say, like, everyone has a fair chance.
And I think maybe part of what I'm getting to with this, and

(23:47):
maybe what you're getting to, is I think CDR has had this belief
that, you know, we need all solutions, like we should do
everything and there's going to be room for everything.
I'm not sure that's actually true long term.
In the short term, absolutely true.
We have no idea which is going to be the type of technology
that scales and is going to be the thing that we should really
be doing and getting to gigaton scale, I certainly don't
have the answer to that. I think all of them have a fair

(24:08):
shot at that today. But if you look at how we build
and deploy technologies in the world, you need to have a point
in which you have consolidation, where the supply chains need to
be, you know, invested in and grown to be able to get to the
scale that we need. Doing the, you know, 10
different supply chains for 10 different methods and, you know,

(24:29):
across different jurisdictions, at a point it's not going to
make sense forever. Wow, Peter, are you allowed to
break with canon on that? You know what?
I often hear people say that in public and then privately I'll
hear someone be like, I'll be like, what?

(24:50):
I'm like, you just said, you just said we already
got solutions. What does that
mean? I'm obviously gonna have to bleep that out, and
then I'll hear them be like, it's all, it's all nonsense that
doesn't even work. I'm just like, OK.
I will be honest about this and you can actually put this on the
podcast. I'm actually happy with this.
I will always be honest about this and I will be consistent
about this. Like, I genuinely don't know

(25:11):
what the answer is. Like I think what's so amazing
about CDR is that our capacity for innovation is still massive,
right? Like not just because we have
such smart people, but because we aren't really, there's so
much room to continue to make improvements.
Like I was actually at this technical webinar today where
folks were talking about new modeling approaches in marine
CDR where they're solving some fundamental problems.

(25:32):
Like how do you actually get things like uncertainties, like
how do you measure, like how good the model is in a case
where you have, you're sort of over constrained the problem,
like you have too much data and you're not sure like which is
the right data to use. Like historically, what that
means is, like, OK, we don't really know what kind of model
to use. We just have to rely back on
experts, you know, like, whose opinion is it that one is better than
the other? And this is a new approach that
says no, no, no, no. Like we can actually propagate

(25:53):
those uncertainties through. And we have a way of actually
calculating a score of how well the model works based on
that set of data. And that is transformational for
marine CDR. That's something that they
haven't really had. And so that's just one of many
examples where the type of innovation that we're going to
need to build more trust, to build more certainty, to scale
CDR, a lot of it doesn't exist today.
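The model-scoring idea Peter describes here can be sketched roughly in code. This is a hedged illustration, not the actual approach from the webinar he mentions: the misfit function, the softmax weighting, and all the numbers below are assumptions invented for demonstration.

```python
import math

def model_misfit(predictions, observations, sigmas):
    """Reduced chi-squared style misfit: average squared error, with each
    observation weighted by its own uncertainty (sigma).
    Lower is better; near 1.0 means the model fits within its error bars."""
    terms = [((p - o) / s) ** 2 for p, o, s in zip(predictions, observations, sigmas)]
    return sum(terms) / len(terms)

def model_weights(misfits):
    """Turn misfit scores into normalized weights (a softmax on the negative
    misfit), so uncertainty gets propagated through instead of asking an
    expert to pick a single 'best' model."""
    exps = [math.exp(-m) for m in misfits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical: two candidate ocean-chemistry models scored against the
# same observations (all values invented for illustration).
obs = [2.1, 2.4, 2.9]
sigma = [0.2, 0.2, 0.3]
model_a = [2.0, 2.5, 3.0]   # tracks the data closely
model_b = [1.5, 2.0, 2.2]   # systematically biased low
misfits = [model_misfit(m, obs, sigma) for m in (model_a, model_b)]
weights = model_weights(misfits)
```

The point of the sketch is the shape of the idea: the data itself produces a score and a weight for each model, rather than an expert opinion deciding which one wins.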

(26:14):
And that's a good thing. The fact that
we don't know which solutions are going to
actually really work long term is a good thing.
It means our capacity for improvement is still massive.
Maybe just set out the vision for Absolute Climate and restate
the problem that you're trying to solve,
especially since you're expressing doubt about your ability to

(26:34):
know, which is a characteristic I respect.
But the job that you've undertaken is seemingly to be
the knower of something. Maybe you disagree with that, or
maybe you don't, but how do you feel about that?
I think my job, the job that I think we should have, and that I
personally think the entirety of the MRV field should
have, and I don't think everyone takes this the same way.

(26:55):
People may disagree with me, but I feel pretty strongly
about this: our job is not to make a determination of what's
good and what's not. Like there are going to be some
decisions that society needs to make.
I'm like, where do we want to allocate resources?
How much certainty or uncertainty are we comfortable
with? And, and just are there other
reasons that we may want to scale some solutions or others?
Like those are societal decisions that I think
should live outside of the purview of MRV, to defeat

(27:19):
conflicts of interest, to make sure the incentives are aligned,
and to make sure that we're doing our job well.
MRV should be purely about how we are accurately and
transparently assessing what the atmospheric impacts are of these
projects. And there may be a political
decision after that of like, okay, we understand this and we
don't care. We're gonna do this anyway
despite this uncertainty, or this is something that we think is

(27:40):
important. And so let's go forward.
Despite what we understand about how the atmosphere is being
impacted. Like maybe we decide, okay, we know that it takes 1000
years for the removal to really match the emissions damage that
is being caused, but maybe we don't care about that.
Like we should just do 100 years.
I think it doesn't mean we don't think about the thousand year
time frame. We should still be evaluating
the atmospheric impacts from that perspective.

(28:01):
But then there could be a political decision after that
that says like, OK, but you know what?
We're going to do this anyway, even though we know this isn't a
perfect match. Interesting.
So your attempt is to be values neutral.
Otherwise, without that, then my incentives are going to be to
change the rules and change my standards to be as attractive as

(28:22):
possible to project developers or governments even right, who
actually struggle from the same incentive problems.
Like the only way I think for us to keep the standards honest and
to make sure that MRV is giving us the information we need is to
take them out of the political process.
Like they should be tools for policy.
They should not be dictating policy.

(28:42):
Interesting. Do you know much about the
common versus civil law traditions?
Is this familiar ground for you? I don't think so.
So OK, I'll probably have to do this in the intro just so people
can get a good basis for understanding it.
But the Anglo-American tradition is primarily common law, which
is judge-made law. People would bring cases to
judges, judges would discover new law, they would apply it in new

(29:06):
ways. And we build on precedent
and stare decisis, where you're building up a case history
of decisions that can be pointed back to.
But the civil law tradition is famous from Napoleon coming
in and trying to state by statute basically everything.
And so rather than trying to find and discover new precedents
or go back to older precedents, it's very much

(29:27):
about referring to statute and textual analysis.
And I tend to have more faith in the common law tradition because
I like the ability and the flexibility of working from
general principles and learning
in law in that way. That always intuitively made sense to
me. Whereas with
the Napoleonic Code and civil law,
the goal was to take legal discretion out and to make it

(29:48):
all about technicians merely applying the law.
There are trade-offs, of course, because there are trade-offs in
everything. And I'm wondering to what degree
you prefer something like how the civil law tradition was just
described, or whether, like me, you see some value in maintaining a
common law tradition within carbon removal.

(30:09):
Or maybe we don't know exactly how alkalinity and
wastewater treatment should be dealt with right now.
We're going to adjudicate it and then we'll come to a general
principle, and that will come in later.
But not everything needs to be decided as early as possible for
a generalizable, totalizing standard, which faces a serious
risk of the political economies that you mentioned.
Yeah, I think that's generally true.

(30:30):
I think there's one sort of, at least for me, like this driving
challenge, and I think the common law
is actually a really good example of what makes that a
challenge, which is, tell me if I'm wrong about this.
I'm not a legal expert, but my understanding of common law is
that the challenges to it are driven by harms, right?
Like if someone has been hurt, that is the basis for how you
challenge a common law, you know, an existing common law

(30:53):
precedent, right? Like you say, like, hey, this
isn't working. Here are examples of how that's
not working. That is generally the process we
use across a lot of society, right?
That is uniquely challenging in carbon markets.
And that's because the harms are extremely diffuse.
They're not felt by any specific individuals.
And the lag with which we experience them is extreme and

(31:14):
very hard to attribute to one source or another, right?
Like the harms we're going to see will be decades out.
If we're wrong, let's say we are massively over-crediting, we're
getting this wrong. We're not going to really know
from a first principles perspective until a decade or
more later, after so
much unbelievable harm has accumulated from this mistake that we're actually seeing it in
global models or our experience in how the climate is

(31:36):
changing. And so now attributing that
harm to an individual or a specific action becomes
almost impossible. Like we can't get around the
fact that CO2, like you and I are both sitting in rooms that
are filled with carbon dioxide, and it's invisible and it's
inert and it's highly diffuse. And so if we could do a better
job of establishing those harms and then saying like, hey, like

(31:57):
we see this problem and now we need to change it, then that
would work better. But I think alkalinity is a
great example where right now, if you look at how most
standards are built and how we deal with those uncertainties,
it's we'll do something like make your best guess.
And in some cases, like maybe discount away what we think is
sort of a conservative confidence interval.

(32:18):
But in many cases, we're not sure if that actually is
conservative or not. And then we're going to pretend
like it's not a problem anymore, when that actually isn't a
scenario that gives us much room to say, like, yeah,
this actually is a problem. Like we need to keep innovating
on this. Like our certainty around
this is not enough for us to actually really build the whole
industry on top of this. And so we should be transparent
about that. We should be clear that this is
an unknown and not pretend like we can sort of deal with it even

(32:41):
though we don't have the solution today.
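The "conservative confidence interval" discounting Peter mentions can be made concrete with a small sketch. This is an illustrative assumption, not any registry's actual rule: it credits the lower bound of a one-sided confidence interval on measured removals rather than the mean best guess, and the z-value and tonnage figures are invented.

```python
import statistics

def conservative_credits(samples, z=1.645):
    """Credit the lower bound of a one-sided ~95% confidence interval
    (z = 1.645) on the measured removals, instead of the mean best guess.
    The wider the measurement uncertainty, the larger the discount."""
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / len(samples) ** 0.5  # standard error
    return max(0.0, mean - z * sem)

# Hypothetical monitoring runs, in tonnes of CO2 removed (invented numbers).
measured = [980.0, 1010.0, 1050.0, 940.0, 1020.0]
credits = conservative_credits(measured)  # lands below the 1000 t mean
```

The design point Peter is making still applies: a rule like this is only conservative if the sampled measurements actually capture the real uncertainty, which for something like ocean alkalinity is exactly what's in question.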
They're good points, and the point you make about torts and
tortious claims is why regulation tends to step in, in the
common law tradition, in place of torts, because it's too
diffuse and it cannot be proven in court.
And it's not the same as so-and-so dumped something terrible on

(33:01):
your property and you can point to that; it doesn't work that way.
And so yeah, something more statute-based may just make
more sense for this.
Given the vested interest portion of this, how
do you build a system that has some amount of flexibility
so that we don't end up with ethanol or that GE filter or

(33:22):
scrubber, but also allows people to plan very far in the future.
Because we don't want to, like, decide right now that, OK, BiCRS
is the most important or the worst of all CDR methods
and we're just now stuck in that forever.
How do we learn and grow in a system that does have a
totalizing impact? I think from my perspective,

(33:44):
there's only one answer to that question.
I think it's just, we should base it on performance. Like a lot
of policy today has been based on, I want to support this
pathway, a lot of it going to direct air capture.
Is that right? Is that not? I don't know.
I can tell you that a lot of it was designed based on like that
pathway specifically. And if what we care about is
really climate impacts, then what we should care about is
climate impacts and how we measure those.

(34:06):
And so why not build policy, build approaches, build MRV
based on performance. And what's nice is like, yes,
you do have to set, sort of, here's the bar, right?
Like here's what we think is the right performance.
But if we're wrong about that, that is something that can be
adjusted over time, right? We can decide, like, actually
maybe we were overzealous or maybe we weren't tight enough, but we
are clear about what is needed and why it is needed, and a

(34:28):
project has flexibility in how it gets there.
We don't have to hard code that to any specific pathway or
approach. You might have to choose some
things, like whether the ultimate determinant of success or
failure is linked to parts per million in the atmosphere
versus radiative forcing, or something else is determined to be
success. Granted, these two things,

(34:48):
there's a relationship between them, and ideally you could
specify that and convert. But is there not some sort of
fundamental like disagreement that people could go one way or
another on a certain issue like that that will distort the
outcomes of what kind of carbon removal is built or not built,
right? There are probably some cases of
that. Yes, definitely.
And this is gonna be a situation where it's gonna be impossible

(35:08):
for us to find some perfect solution.
Unfortunately, across all climate, including an MRV, there
are no silver bullets. There are only more lead
bullets. And so I think
there are always gonna be cases where we have to just sort
of do our best and try our best to make changes later.
But I think we should just acknowledge how hard that is.
But that's not a simple trivial thing.
And if there are opportunities for us to be looking at the

(35:30):
bigger picture, creating consistency and setting the
north star of what we think good looks like today, and making that
very clear to projects: like, hey, it's okay that you can't
meet this today, but probably at some point you're gonna need to.
And if you can't, that's gonna be a problem.
But if we don't make that clear today and build that into
policy, we are likely to get lock-in on subpar solutions

(35:50):
that are politically popular for one reason or another, but
actually may not help us get to the climate impacts we're
looking for. I'm so intrigued by you and this
approach, Peter. I like that you're trying not to
add in more judgment than is necessary to your job.
Well, because if you had very strong opinions about some of

(36:12):
these matters, it might feel just as political or
self-serving as many others who are trying to set their own
standards. But it doesn't feel like that is
as prominent here. And I think now is the
appropriate time to get into the conflict of interest
question. Everyone talks about it.
It's either not that big of a deal or they made some
innovation so that it's less of a big deal.

(36:34):
How should we be thinking about conflict of interest in carbon
removal? It's maybe the hardest problem
that we need to solve. I think actually much harder
than the science or the engineering or anything else.
Like I think we are in this difficult
position where, when we're removing carbon dioxide from the
atmosphere, whoever's buying that is not getting an Amazon
package filled with CO2, you know, that they can then measure

(36:55):
and check on afterward. Like there actually really is no
product. Like what you're getting is the
data. You know, the MRV truly is the product.
And so all they can really assess is, you know, what that
data looks like in the end. The buyer very rarely has
the source of truth. And that just makes our job
really hard. I think, you know, it
makes it much easier to be in a situation where now, again, to

(37:16):
be competitive, you might have folks who degrade standards to
just make them sort of more attractive, or, you know, making
mistakes, maybe even, you know, unintentional
mistakes, but then not wanting to have to
look your customers or your stakeholders in the eye and say,
oh yeah, I made a mistake here. And, you know, now I'm going to
do this hard thing around fixing it.
I mean, I won't name any names, but there was recently a big

(37:37):
registry who had an issue with, you know, one class of their
credits that, you know, academia had pointed out.
And it took them half a decade to acknowledge and actually
fix it. It took a really long time.
And why did it take long? Like, why did they not fix it
sooner? There was no reason to, there was
no incentive. There was nothing that held them

(37:58):
to make that change. There was no downside
to being wrong. And so I think the unfortunate
challenge is, like, I would love to say, again, silver bullet:
if we just fix this one thing, you know, this one little trick
negative incentives hate when you do this.
It's just not like that. Like every
organization has incentives: for-profit, nonprofit, academic,

(38:19):
civil society, governments, everyone has incentives.
And so the trick is, how do you build a system that aligns
those incentives in all the right ways?
So that's, you know, hopefully, one, definitely the climate is
getting what it's paying for, but also, you know, like
whoever's purchasing this is getting what
they're paying for. And so it's a complex
challenge. To be
honest with each other, I don't think anyone has the full solution

(38:41):
yet. You can't undermine my follow-up
question! I was hoping you were going to
tell me. What's the point of this conversation, people?
I mean, I can tell you ideas. I can tell you how we think
about this. What do we do about this
at Absolute? To be fair, this is an incomplete solution.
There's more that needs to be done.
But for example, like, you know, I think one of the core
challenges we have is who is actually designing and building

(39:03):
these standards, or who's deciding what quality is, right?
Like in a lot of cases, it's singular organizations, right?
I mean, and we are definitely in that bucket right now too, where
I think, like, you know, it's one person's opinion on
what you think is right or wrong.
And I think that problem actually gets compounded in
this scenario, the sort of market that we have today

(39:25):
where the folks who are generally building these
standards are also the folks who are running registries or
selling or brokering credits. And from our perspective, that's
just a massive conflict of interest.
It's the same people who are deciding what's good enough and
the same people who are then, you know, issuing or selling
credits. It's sort of like if
you were in school and I write the test
and I grade myself against that test.

(39:46):
It's like, you know, like it's a single point of failure.
Like one organization could decide I want to reduce quality
in order to issue more credits. And there's just no check and
balance against that. And so we're trying to fix that.
For example, where we are intentionally deciding to not
have a registry. We're not, obviously, doing
anything around selling or brokering credits.
You know, we are specifically quality assurance.
That's our job. Our job is to think through and,

(40:06):
you know, be accountable for how we determine whether
the atmospheric impacts are happening.
And then we partner with registries, we partner with other
folks who can then adopt our standard.
And so, well, again, acknowledging that
that is not a perfect solution, but it does create a
separation of powers, where there's not one organization that can
just decide to make a change without being accountable to
anyone; there is at least one layer of accountability.

(40:27):
And so that's an example of something that we're doing that
we think will hopefully put us in the right direction.
I have so much to say and to ask about this, and
I suspect much of it will find its way into the introduction.
But so with Nori, we were vertically integrated and we're
a registry. We developed our own
methodologies. We're like a quasi project

(40:48):
developer too because we had credits being issued and also
just obviously a marketplace. This type of integration put us
out of step with ICROA and ICVCM. So we could just not qualify for
those stamps. And then very famously, Puro was
originally a competitor of ours,but then disaggregated.
And I'm wondering because if youare a registry and if you choose

(41:09):
to remain only a registry, the only moat that you have is the
methodologies that you develop and issue credits through.
And that is a direct response to ICROA setting these ex ante
rules. Or maybe it's better to even say
aprioristic rather than ex ante, sort of like precautionary
principle based rules of, this is just an incorrect format.

(41:31):
In order to legitimately do this activity, you cannot do both of
these things. Unfortunately, it means that
registries have to own methodologies and paywall them
and issue credits through them. So one of the things that we
were able to brag about is that we released our methodologies
under a very permissive Creative Commons license, because we were
vertically integrated and did not need to paywall the
methodology portion to run a successful business.

(41:55):
Think about it. I mean, there's
conflicts of interest in any type of
organizational structure, but
the one that you're describing is basically baked in by the
quasi-regulatory bodies that oversee the VCM.
Do you agree with that characterization and how much of
a problem is it actually? I think that is how things have
worked out. And yeah, maybe those have come
through these accreditation bodies, like they

(42:16):
had sort of led the way. I don't think I have
enough information to say either way.
It's certainly, like, sort of the business-model-first
choice for a registry: we build proprietary systems, and to
use our systems you need to live in our walled garden, right?
That's just the way it works. But I don't think it has to be
done that way. And I think we're trying to
demonstrate a way that's different, because if you really

(42:37):
think about it, what are the roles and responsibilities that
exist when you're talking about credit certification?
I think there are sort of two very distinct roles, and usually
a registry plays both roles. You know, one is the quality
assurance: like, how do we assess whether this is actually a
quality thing and, you know, buyers are getting what they're
paying for. And then the second step is
credit issuance. It's running a registry, making
sure there isn't double accounting, you know, making sure

(42:58):
that the tooling is in place so buyers can, you know, retire
them and, you know, make that process really easy.
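The credit-issuance role Peter outlines, issuing credits and guarding against double counting and double claiming, can be sketched as a minimal ledger. This is a toy illustration only, not Evident's or any real registry's system; the class and method names are invented.

```python
class ToyRegistry:
    """Toy credit ledger: each credit ID can be issued once and retired
    once, which is the basic guard against double counting and
    double claiming."""

    def __init__(self):
        self.issued = set()
        self.retired = {}  # credit_id -> buyer who retired it

    def issue(self, credit_id):
        if credit_id in self.issued:
            raise ValueError(f"{credit_id} already issued (double counting)")
        self.issued.add(credit_id)

    def retire(self, credit_id, buyer):
        if credit_id not in self.issued:
            raise ValueError(f"{credit_id} was never issued")
        if credit_id in self.retired:
            raise ValueError(f"{credit_id} already retired (double claiming)")
        self.retired[credit_id] = buyer
```

The quality-assurance role that Absolute keeps for itself is everything this sketch leaves out: deciding whether the tonne behind a credit ID was real in the first place.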
And these really are actually very different skill sets.
Like, being great at one doesn't necessarily make
you great at the other one, but both are creating very specific
value. Like those are both important to
the process. And so what we're doing is
actually quite simple. It's just, we're going to split
those up, where for the quality assurance part, we do that,

(43:20):
and there we get paid for that work, because it is
important work that needs to be done.
And then our registry partners do the credit issuance and
credit management and that's also important work and they get
paid for that. And so I think we've just
gotten comfortable with a model where we take these two
very different things, smash them together, and we expect that
they should be together. But why is that?
Like, is there a reason why they're somehow actually better

(43:42):
or more efficient when they're put together?
And I'd love to hear counterarguments if you have them, but
it's not clear to me that they are better.
Oh no, it's very empirical for me.
I don't love the aprioristic claim that certain
business models are inherently unworkable,
because there are also cases where vertical integration works very

(44:02):
successfully and with very high quality.
And I tend to have the same common law type of evolutionary
faith. I want to make sure
that we don't winnow out too many design pathways that might
lead to things that are unexpected in the future, where it's,
oh, I wish we hadn't just said that this is inherently
wrong, that no one should ever
investigate this or build a business in this way.

(44:24):
I like that you're doing things differently.
I also like that it's because you are a commercial entity.
Your incentive is to make some money, and good for you.
I hope you all do. That's a really interesting
thing. I imagine, though, your thesis is
very antagonistic to the way
things are done now, which is
(44:46):
neither good nor bad, and something about your disposition
or how you're presenting this. It's because your business is
inherently a criticism of the way things in carbon markets are
done. Which is fine.
I mean, Nori wouldn't exist if that weren't the case too.
In fact, you probably shouldn't start a business if you don't
have a criticism of the way things are done now.
What would even be the point? Yeah, but I imagine it probably
doesn't always win you the most friends, especially when people

(45:06):
like Isometric. I've heard them talk about the
way they structure who pays for the registry services as
deconflicting the process to an acceptable amount, and it sounds
like your business is inherently a challenge to that, saying no,
it's not. I mean, I think there is
no one solution. It's probably going to be a
series of multiple different changes that need to be made.

(45:27):
But I think the proof is in the pudding, right?
Like I think a model where we connect the registry and the
standards is one that we've tested many times, and it has
effectively always been the point of failure.
Like that is where it has fallen down every single time.
And so to think that somehow CDR is going to be magically
different, I think that is just magical

(45:48):
thinking. And so I think there are
structural changes that need to be made.
And we obviously have very strong opinions on those.
People may have different opinions.
And I don't think it's clear what's right.
Like, I think it's good for us to try different things.
I guess I would maybe challenge anyone who said, like, hey, we do
this one thing and all of a sudden everything's better.
Like I think the challenge would be, like, show us that's better.
Like show us that that's the case.
And in my experience, there's so many different

(46:09):
dimensions to conflicts of interest that it's
hard to say one thing is going to somehow fix everything.
Maybe it makes it better and, you know, maybe it's good enough
for now, but that's a separate argument.
But the idea that this is not a problem, I think, is a mistake.
I think this is something that'll be a problem for as long as
this market even exists.
OK, let's put on our white hat hacker hats for a second.

(46:33):
How do we break Absolute? Like how do we thoroughly
corrupt your institution and ruin what you're trying to
achieve? Like what's the weak point?
Great question. I think one weak point that we
have is that we are still, as of now, the sole primary
authors of our standards. Obviously we work with academia, we
work with experts, you know, we engage with those folks

(46:55):
too. And so, you know, the end result
is something that we feel is the culmination of
expertise and knowledge across the industry.
But we are the sole, like, we decide, you know,
what's good enough; we are the sole decision makers there.
And I think very likely that's something that's gonna have to
change at some point. And so, like, you know, from a
corruption perspective, we could decide, like, hey, we don't
really care about climate anymore.

(47:16):
We want to now just make a better business and be more
attractive to project developers.
Like there's a lot that we could change to corrupt the way that
we think about creating, you know, quality outcomes and
consistent outcomes, that would be much better from a commercial
perspective. And that is
literally just a decision that we could make at some point.
We are not making that decision and we have no intention to make
that decision, but we could. And so that is, you know, a

(47:37):
failure mode that exists across the entire industry, where
like there are these single points of failure where it's
very difficult for there to be checks and balances in place.
I mean, we recently saw this. I also won't name any names.
Like we saw a situation recently where there was an NGO who
called out an issue with a specific set of standards and
nothing happened. Basically that sort of warning was ignored

(48:00):
and credits were issued anyway and buyers accepted the credits.
And so for me, it was such an example of where
this can go wrong, where if the industry
itself decides, like, hey, we don't care about this problem,
we think this is good enough, we're going to move forward,
what's to stop it then? I mean, that's a question for
you. I mean, I don't know, really

(48:20):
hard question. I'm sort of surprised whenever I
hear things like that, that buyers aren't more persnickety
about it. You would think they would
defend their interest more because they're going to be the
ones left holding the bag if it falls apart.
I mean, for some of these things that are more innovation
oriented, that are small numbers of credits, you can just say,
like, it's a write-off, right? It's
an early industry kind of thing.
We're not using it against our fossil emissions or something

(48:41):
like that. But I don't know, that's like
partially OK, I guess. It's tricky, right?
I mean, in some cases, I think there is unfortunately
a problem that we have in the space right now.
There's a small number of buyers who have the technical capacity
and time, honestly, to do enough diligence to say what's right or
what's wrong. Like there's a very, very small

(49:02):
number of buyers who have very strong opinions.
And that's, that's actually a problem, right?
Like, again, that doesn't exist in any other industry.
Like, if you go to, I was in Zurich, Switzerland last
summer. Lindt, you know, Lindt chocolates,
they had a little factory there, like a factory tour where
you can go as a visitor.
It's very cool. Like it's very cool to see a lot
of the Swiss engineering, and you get to eat a lot of chocolate.
One of the things that I love the most is they had a whole

(49:22):
little section on effectively like their version of MRV.
So quality assurance. And they talk about like, how do
we make sure that we, you know, from a very wide variety
of different inputs, like different feedstocks that come
in from different parts of the world,
how do we make sure that our chocolate is great every single
time, that every single bar meets our quality criteria, and
that our, you know, our customers are going to

(49:43):
love and enjoy it. How do we make sure?
And they do an intense number of different things, but a lot of
it is that they rely on standards, but they
don't rely on them entirely. Like they make sure that they
are double-checking the results, like they are in some cases
checking every bag of cacao beans with sensors to make sure
that it meets their bar of quality.
And so I think there's a sort of a problem here where the buyers

(50:04):
don't get to see what they're getting.
It's really hard for them to check to make sure that the
quality has been delivered as promised.
And it gives a lot of leeway to standards organizations, to project
developers, to the industry as a whole to kind of do what they
think is best. It's surprising, though, when
CarbonPlan or any of the various watchdog organizations

(50:24):
write a piece about something like that,
you would expect some of the big buyers to take that pretty
seriously. Or maybe they just have.
Maybe it's like the British royal family, where you never
complain, never explain. And you're just sort of like, I
don't want to Streisand effect this and draw more attention to
it by referencing it. Like, maybe it's better
for the company to take this one on the chin, but also not

(50:47):
to draw too much attention to it.
That might be the charitable explanation of that way of behaving.
I mean, it's a challenge, right? Like I don't think we're in that
situation right now. Like the buyers are
all very thoughtful. We're very lucky to have this
set of buyers who are thinking so critically about this.
And a lot of them I think are willing to take risks, like
they are intentionally doing this really hard, really risky

(51:07):
thing because they think it's better for the world.
And so we're very lucky. Like that could maybe
explain a lot of it, but I think there's this other part where,
like, I think long term we also need to recognize that the
buyer's incentive, you're right, is to not hold the bag, but
there's lots of different ways to not hold the bag. One is,
like, make sure you're buying stuff that's unassailable.
But the other way is buy stuff where it's really hard to prove that

(51:27):
it's low quality, or, if it is shown to be low quality,
like, don't talk about it. Like, that also allows them
to not hold the bag. And so I think it
gets back to the incentives being really complex.
It isn't just like people have this one very clear
incentive. There can be multiple different
shades of that, some of which are great, some of which can be
disastrous. Yeah.

(51:47):
There's so much here, Peter.
You're working with a registry right now, though.
Yeah, you're commercializing. How's that going?
It's going amazingly. We feel really lucky to have
them as a partner. So they're a company called
Evident. They are historically coming
from the renewable energy credit (REC) space, where they've been
operating for two decades. They've issued more than a
billion credits in 60 countries on thousands of projects.

(52:10):
So they're one of the big players there. And I think they're an organization that really understands how to do credit issuance and certification incredibly well, in a really trustworthy, high-accountability way, and they're now expanding into carbon removal. And so to do that, you know, I think they share a lot of the values that we have around the importance of trust and consistency. And so they've adopted our

(52:30):
standards. So they're using our quality assurance approach, our standards and modules, to, you know, do the credit issuance on their registry.
Is direct air capture and storage going to be the first methodology?
That has been our first methodology. So actually, we just released it for public consultation, that just closed, and we now need to do all the responses to the great comments that we got.

(52:52):
So that's kind of the next big thing on our list, but we're actually already working on other pathways. So I think one we can mention, because we've made the announcement publicly, is we're going to be doing coastal alkalinity enhancement with a company called Vesta. I think it's one of the sort of most interesting ways of adding alkalinity to the ocean in a way that, like, is actually quite measurable. I think Vesta has some really interesting innovations around how they can get better data, which aligns really well with our standard.
But we're actually already now starting more than one other pathway in parallel that we'll hopefully really be seeing in the next several months, and those will have credits issued through them on Evident, that's the plan. So yes, right now they're the only registry that we are partnered with.

(53:35):
But that is not our model. Our model is that we think competition is generally very good in our industry, like we should be competing on price and we should be competing on scale. So one thing that we think is bad, and this is where the race to the bottom comes in: we don't think we should be competing on the definition of quality. We think that should be something that is consistent across the whole industry. And so our model, and our registry partners know this and are actually quite appreciative

(53:56):
of this, is that we want to partner with other registries too, and other types of organizations, because we think this sort of quality layer should be the thing that is ubiquitous and consistent across all the different market mechanisms that we'll be dealing with with CDR.
Did you ever read the book The Wide Lens?
Do you know about this? No.
Just pick up a copy. It's a

(54:18):
collection of case studies about ecosystem plays and when they succeed and when they fail. And what you're doing is so interesting, because it's one of those things where I could see project developers not necessarily wanting to flock to you for their own reasons. The registries that you compete with don't necessarily want to fire their science teams and

(54:39):
outsource a lot of that responsibility to you, and pay you as well. And then they have fewer assets that are valuable in case of a wind-down. What do you do in a case like that? You can see you'd start over with a new registry that's looking to break into it. I mean, they're old, but, like, they're breaking into carbon removal. But is the goal to have Puro and Isometric and Verra come

(55:00):
come hang with you? I'm not sure that's your goal, but tell me, is that a likely outcome?
I mean, I think definitely, like, we want to work with everyone eventually. That is definitely our goal. And I think the way I think about this is there's sort of short-term benefits in competition and there's long-term benefits in competition. And the short term, sure, like, we're in the industry that we are in today. It's sort of this dogfight in the Wild West of, like, oh, mine

(55:22):
is better than yours, except I can't prove that. And, you know, like, my opinions are better than your opinions. So fine, that is where we are today. But if you don't take this approach, and you extend it out to where we are likely to be going, which is, I think, integrating CDR into governments and cap-and-trade and compliance markets, then you can sort of better see the challenge that we're going to be facing there, where

(55:43):
we have the same dynamic happening. There's two sorts of outcomes here that I think I'm deeply afraid of personally. One is we're going to see the same sort of mindset of, like, everything needs to be built one-off, custom, in-house, we do it our way because we're smarter than everyone else. And we're going to end up with 100 different countries who have integrated CDR into their, you know, government structures, but they're all going to do it differently.

(56:05):
And that's not just bad from, like, a climate perspective. Like, we'll actually have no idea how much climate progress we're making, because everyone's going to be measuring this completely differently. But now imagine that from a project developer's perspective. Like, imagine you have to comply with 100 different types of standards. It's insanity. And, you know, there are mechanisms for maybe fixing that, like Article 6 and whatnot, and, you know, maybe even SBTi. But they're not ten out of ten.

(56:28):
They're not moving in a direction where it actually looks like they're trying to solve this. So that's the one thing I'm really afraid of. The second thing is we forget that Article 6 isn't just about how we measure quality, but also has other things. For example, it builds things like emissions trading capabilities, which I think can actually be a really good thing. There should be certain types of jurisdictions or certain parts of the world where they might be better at direct air capture, they

(56:48):
might be better at enhanced weathering. They see this as an opportunity to build new industry, which I think is good. Like, that's the way we're going to actually get adoption of CDR. The challenge, though, is that same race-to-the-bottom mechanic that we've seen in the voluntary market, without really rigorous, consistent standards across governments and across, like, how we think about this, could actually replicate itself at national scales, where countries

(57:09):
will want to compete to bring in project developers and bring economic opportunities. And they might think, like, hey, maybe I'll just make slightly weaker standards, because that'll let them issue more credits and that'll look great. And all of a sudden, that same race-to-the-bottom mechanic has spread across the globe at gigaton scale. That is the nightmare scenario that I really spend time thinking about, worrying about.
Dang, this one really made me

(57:31):
think. Peter, I asked you hard questions too. I hope you don't feel on the defensive.
No, these are great questions, I love them.
Yeah, I'm fascinated by what you're trying to build. I'm glad there's more people thinking about this. I also just respect that you took an orthogonal view to everyone else. Like, do you even have any competitors besides just the registries who are integrated in the way that you describe?
Even them, like, I imagine that

(57:52):
they are not our competitors, like, we are happy to be partners with them. Like, you know, we're effectively, like, they are our customers, like, that's how we integrate. And so we're happy to work with them when they are ready. I think you're probably generally right that right now it's sort of hard to say. Like, we used to have opinions on this, but maybe we don't anymore. Like, I think we have more work to do to demonstrate that there's value in this and that creating consistency across the industry is going to be

(58:13):
beneficial. And I think that's absolutely true. Like, you know, one of the sort of hard facts that we need to grapple with in this industry is we effectively have, like, two, you know, maybe if you want to be generous, like, five buyers, and that's it. Like, you know, those five buyers are basically putting the entire industry on their shoulders. And that is really problematic. Like, that should be, I

(58:33):
think it is, alarm bells for almost everybody. But I think one thing that we need to do a better job of is asking ourselves this question of why, why are there not more buyers? And I think part of it is definitely, like, cost. It's expensive and there's sort of no reason to do this, you know, until we have regulation, and that's not going to happen. So I think another thing that has been highlighted to me in the conversations I've had with buyers is also that they've seen this rodeo before, that, like, when

(58:57):
they bought forestry, when they bought avoidance, and had that go wrong for them, had that go poorly for them. It's because the MRV wasn't sorted, because it wasn't consistent, because it was very clear that they weren't necessarily going to get what they were paying for. They can see the complexity in trying to determine whether I should support one project or another. And they're not fools.

(59:18):
They're actually quite smart. And so if you're going to be putting down millions of dollars, do you want to do that in a situation where, like, you're confident that you're actually going to be able to make a claim in the long term? Or, you know, are you OK with massive uncertainties? And it turns out we're very lucky to have a small number who are OK with massive uncertainties. But actually, maybe that's not the model for the larger market that we're going to have to build.

(59:40):
Very rarely do I come away from a show with more questions than I started with. There's so much to talk about here. We'll pick another one up some other time. Peter, thanks for being on. Thanks for enduring the barrage of questions, for wading into the legal theory comparisons. Thank you.
It's been my pleasure, Ross, like always, fun to talk to you. I think this has been one of the more stimulating conversations

(01:00:01):
I've had in a while. And, like, getting to this level of complexity is how we should think about this. Like, if we want to succeed in CDR, the question we should be asking ourselves is how do we make it more like legal theory and law? How do we make it more like, you know, financial regulation? How do we make this a system like the ones the rest of the world has already shown us are needed to build scale, to build trust? We should be thinking more in those terms. And that's how we solve this larger problem.