
May 21, 2025 • 32 mins

Litmaps, a Wellington-based startup that's just raised $1 million in a seed funding round, is redefining how scientists navigate the sea of academic literature by merging citation network analysis with generative AI.

Founded in 2016 by Axton Pitt and Kyle Webster, the platform now serves over 2 million researchers globally, including institutions like Harvard, Stanford, and the University of Cambridge. Its mission? To "accelerate impactful science" by helping researchers identify gaps in knowledge and avoid redundant work.

It's a playbook familiar to many Kiwi startup founders: Kyle and Axton, who was a University of Auckland biomedical and computer science student at the time, grew frustrated with legacy systems poorly equipped to meet the needs of modern researchers.

“We just found the tools that you had to use very clunky,” Axton told me on this week’s episode of The Business of Tech. The acquisition of rival Research Rabbit now gives Litmaps an audience of millions worldwide. Find out how better scientific research discovery, powered by AI and visualisations, is changing how science is done on episode 99 of The Business of Tech.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Welcome to the Business of Tech powered by 2degrees Business,
the podcast exploring the innovators shaping our digital future. Today,
we're joined by Axton Pitt, co-founder of Litmaps, a
Wellington-based startup that's transforming how scientific research is discovered, visualized,
and connected. With a background in biomedical science and computer science,

(00:28):
Axton's journey began with a personal mission to make an
impact in health and technology. Litmaps recently made headlines with
its acquisition of US competitor Research Rabbit and a one
million dollar funding round, bringing its powerful AI-driven research
platform to over two million users worldwide. In this episode,

(00:50):
we'll dive into how Litmaps is using artificial intelligence to
accelerate impactful science, the challenges and opportunities of building a
global SaaS business from New Zealand, and what the future
holds for research in the age of AI. So here's
Axton Pitt on episode ninety nine of the Business of
Tech. Axton Pitt, welcome to the Business of Tech. How

(01:20):
are you? Doing good, thanks, thanks for having me. Well, congratulations,
it's been a big couple of weeks for you. You've
done a big acquisition of a competitor, Research Rabbit, that's pretty cool.
You've raised a million dollars to fund your expansion, and
that acquisition will give you what, two million users, researchers
around the world using Litmaps. So that's very impressive.

(01:44):
So we're going to get to all of that and
talk about how you're using artificial intelligence as part of Litmaps.
Just start by telling us a bit about your own background.
From what I can tell, computer science, biomedical research, that's
really been the sweet spot for you in terms of
your studies.

Speaker 2 (02:00):
Yeah, so I studied an undergraduate degree in biomedical science, and
that was sort of the start of the journey of
what's possible with the human body and trying to make
an impact in some way. And then I fell into
really the computer science realm, where I was programming as
a hobby and sort of tried to combine that with
that interest that I had, based on a previous health incident.
Basically, look, I had severe asthma as a kid.

(02:22):
Can I do something, contribute in the medical space? So
that's where those two interests sort of came from.

Speaker 1 (02:27):
Okay, so when you were at the University of Auckland,
is that what your area of research particularly looked at,
some sort of answer to asthma?

Speaker 2 (02:35):
No, that's just in terms of my personal backstory at a
bachelor level, and taking those interests, trying to translate them
into maybe a PhD pathway, wasn't quite getting over the
line there to do it, and then found computer science
and was lucky enough to go to San Francisco and
attend Apple's Developers Conference, which really kicked off this sort
of software and tech realm for me.

Speaker 1 (02:54):
Yeah, I've been to a few of those, the WWDC
I think they call it, an epic conference, and a big health
tech focus as well. What you can do now with
Apple devices is pretty impressive.

Speaker 3 (03:07):
Yeah. For me, that was the quantified health sort of era.

Speaker 2 (03:10):
You know, things like you know, tracking your steps, your
heart rate, these sorts of things.

Speaker 3 (03:13):
But I sort of found it.

Speaker 2 (03:14):
You know, all those metrics can be a bit overwhelming
as a human and you're sort of like, what steps
should I be taking based on my particular case. And
so that's some of why I got into it, trying
to figure out, you know, what are the limits for yourself?

Speaker 3 (03:26):
Is it ten thousand steps or is it actually eight thousand,
depending on who you are and what you're doing?

Speaker 1 (03:30):
So there's that area of research that is particularly interesting,
what you can do with that sort of data. But
in terms of Litmaps, give us the genesis story
of that. You were at the University of Auckland and
you identified some sort of problem around how people
were trying to find research papers in the course of

(03:50):
doing their own scientific research.

Speaker 3 (03:52):
Yeah, that's right.

Speaker 2 (03:53):
So me and my co-founder Kyle Webster, we basically
came together at a student group actually called Kaias.
He was struggling with his
PhD literature review process. So a lot of graduate students
struggle with this in the early first year of their PhD: what's
already been done, and you know, can you make a
novel contribution? And so we just found the tools that
you had to use very clunky, you know, these were

(04:15):
sort of nineties-era web tools with very
old visualizations. We really wanted a nice user interface and
experience for people. So that was the real first insight,
and then we had a long and winding path to
get to where we are to deliver a solution for that.

Speaker 3 (04:30):
So when was Litmaps actually founded?

Speaker 2 (04:33):
Yeah, so it was twenty sixteen, and it was a
nights-and-weekends project for a long time, and then
twenty twenty, twenty twenty one is when we went full time.

Speaker 1 (04:40):
Yeah, so actually that's coming up on almost ten years ago,
so you know, Web of Science and these other databases
and platforms researchers would have been using heavily then. Things
have evolved a lot. Artificial intelligence and generative AI have
come into the picture. But at the heart of
Litmaps is this concept of connectedness. Maybe explain that

(05:03):
and getting a visual representation of what research exists in
the world.

Speaker 2 (05:08):
Yeah, so obviously we really want to help people understand
what's already been done so that they can contribute
something novel. And so we're coming at this
from the angle that the way papers cite each other and reference each
other is really important. So if you're referencing a
certain domain or a certain topic, that sort of is
telling a lot about what you're looking into, and so

(05:29):
we leverage those citation patterns, and not only direct citations,
it's also sort of who else is citing those types
of things, and then we can sort.

Speaker 3 (05:37):
Of create this connectedness idea.

Speaker 2 (05:39):
Of these papers are very similar because they cite the
literature in a similar way to you, and so that's
the core of our system. Plus we show that in
a visual way. So there's a timeline of publications with
citation count on the vertical axis, and you can sort
of get a sense of the citation tree, if you will,
of historical papers to modern day and how they connect.
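To make the idea concrete, here is a minimal illustrative sketch of citation-overlap similarity and a year-versus-citations layout, the two ideas described above. The Paper structure and the Jaccard-style overlap measure are assumptions for illustration only, not Litmaps' actual implementation.

```python
# Illustrative sketch only: citation-overlap similarity and a timeline layout.
# The Paper fields and the Jaccard overlap measure are hypothetical choices.

from dataclasses import dataclass

@dataclass
class Paper:
    id: str
    year: int
    citation_count: int
    references: set[str]  # IDs of the papers this paper cites

def reference_overlap(a: Paper, b: Paper) -> float:
    """Papers that cite much of the same literature score as related
    (a simple Jaccard overlap of their reference lists)."""
    if not a.references or not b.references:
        return 0.0
    return len(a.references & b.references) / len(a.references | b.references)

def related_papers(seed: Paper, candidates: list[Paper], k: int = 5) -> list[Paper]:
    """Rank candidates by how similarly they cite the literature to the seed paper."""
    ranked = sorted(candidates, key=lambda p: reference_overlap(seed, p), reverse=True)
    return ranked[:k]

def map_position(p: Paper) -> tuple[int, int]:
    """Place a paper on the visual map: publication year on the horizontal axis,
    citation count on the vertical axis, as described in the interview."""
    return (p.year, p.citation_count)
```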

Speaker 1 (05:59):
So you obviously saw a problem there, that the systems
out there were very clunky.

Speaker 3 (06:05):
But it's almost deeper than that.

Speaker 1 (06:07):
Is that the way that researchers sort of look for patterns,
and that is sort of changing as well? It's a
lot more visual, and I've seen the Litmaps interface,
you know, and it is very much here's a pocket
of research on a very specific topic, and here are
some more over there. So is it also that, you know,
how researchers approach doing scientific research and looking for

(06:30):
all the papers that are going to be useful to
that journey of discovery has changed as well and evolved?

Speaker 3 (06:36):
Yeah, I think you know, some things have stayed the same.

Speaker 2 (06:38):
So you still want to be able to tell the
story of I've covered this topic well and thoroughly, and
I understand what an expert would say are the key
papers that are in your field.

Speaker 3 (06:47):
So that is very constant.

Speaker 2 (06:49):
But obviously the way that people are traversing this and
making this job easier has come.

Speaker 3 (06:53):
About with tools like ourselves. And I think we've also helped.

Speaker 2 (06:55):
Graduate students a lot because they're starting out and they
don't really have that mental picture of who are
the important people.

Speaker 3 (07:02):
And topics in their field.

Speaker 2 (07:04):
So we're essentially giving them a bootstrap to get that
mental model, and maybe a professional senior researcher already has
that in their brain already.

Speaker 1 (07:11):
Yeah. So generative AI. I've been using it for deep research. Literally,
with the likes of Perplexity, you can get Claude to
do some incredible research on your behalf, to identify what
the main papers out there are and then summarize them,

(07:31):
do a lot of the work for you. So are some
of your users going through a similar sort of process here?
They're sort of short-circuiting that research process by making
it a lot easier just to find exactly what's out
there they can draw on.

Speaker 2 (07:47):
Yeah, so definitely our tool helps to get from zero
to one very easily. I think we're obviously not doing
the full literature review for you. So some of those
other tools, like Deep Research from OpenAI, are sort of,
you put in a few queries and get this
report that's sort of maybe at a PhD level back
to you, but I think we're sort of in that
helping to educate the graduate student to get to the

(08:10):
point where they could produce that report. So there's an
understanding that maybe the citations that are in there, you
want to check them out and understand what the quality
level is.

Speaker 3 (08:19):
And the AI tools are very good these.

Speaker 2 (08:21):
Days, and I think people have to reassess maybe monthly
how good these tools are because they're evolving very quickly.
But I think there's a combination of: there's a generative
AI tool that can do the work or some part
of the work, but also we want to be able
to train people to produce that. So we see ourselves
as merging both areas, where we provide these diagrams to

(08:41):
help people understand the literature landscape easily, but also we
feed into these sorts of tools with an OpenAI

Speaker 3 (08:47):
plugin, or maybe AI eventually uses a tool

Speaker 2 (08:50):
Like ours to navigate the citation landscape.

Speaker 1 (08:53):
Right. So in terms of your approach to AI, you're
literally drawing on some of those big large language models,
the likes of OpenAI and Claude.

Speaker 2 (09:02):
Yeah, so we're this interesting hybrid of: there's
the metadata and metrics that are very tried and true,
you know, how papers are evaluated, and then there's the
new generative AI tools that'd say, you know, maybe they
can represent what a paper means in some abstract sense.
And so we're using those two different approaches to really,
hopefully, provide best-in-class search. And that means

(09:25):
not only are you getting good results, but you can see
how they connect to what you know, and that's really
where our map visual diagram helps a lot.
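A rough sketch of what such a hybrid ranking could look like, purely as an illustration: a text-embedding similarity blended with a tried-and-true citation metric. The cosine helper, the log normalisation and the 0.7/0.3 weighting are assumptions, not Litmaps' implementation.

```python
# Illustration only: blending an embedding-based relevance score with a
# citation-based metric, as one way to combine the two approaches described above.
# The embedding vectors are stand-ins for whatever model is used; the weights are arbitrary.

import math

def cosine(u: list[float], v: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def hybrid_score(query_vec: list[float], paper_vec: list[float],
                 citation_count: int, alpha: float = 0.7) -> float:
    """Weighted blend of semantic similarity (what the paper means) and a
    citation metric (how the paper is evaluated), squashed to a 0-1 range."""
    semantic = cosine(query_vec, paper_vec)
    citation_signal = min(math.log1p(citation_count) / 10.0, 1.0)  # crude normalisation
    return alpha * semantic + (1 - alpha) * citation_signal
```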

Speaker 1 (09:31):
And I guess that allows a researcher to use
more sort of conversational AI-type searches and prompts, like
we do every day with ChatGPT or Perplexity, where you
can literally say, hey, you know, where are all the best
papers on breast cancer, for instance, and it
will go out there and look for them and then
represent them visually for you.

Speaker 3 (09:53):
Yeah. I think that's a good point.

Speaker 2 (09:54):
So there's, you know, a way that we can
tie in, you know, maybe you want to say, what are
the key themes in these papers that I've been recommended,
and the generative AI tool will help pull that
out for you. An example of this as well is
just the time speed-up. So we've got an orthopedic
surgeon in Spain and he looks at, you know,
one of the latest techniques to heal this broken bone,
or the surgery technique, and so sometimes on the day

(10:16):
of surgery he's looking that up before he goes into
the theater because it's a lot easier to run that
search than it used to be.

Speaker 3 (10:21):
So that's some of the impact we're having.

Speaker 1 (10:23):
So it's not only at the early stage of research, where someone's
formulating a hypothesis and is going to set out on three
years or more of research to come up with new knowledge.
You're actually getting surgeons, people on the front line, who
are using this as a learning tool before they potentially
go into surgery.

Speaker 2 (10:42):
Yeah, and you know, that's a really great impact, and
one of our mission values is to accelerate impactful science.
We want to see, you know, those outcomes, and that's
kind of what gets us out of bed most days.
But then there's also tons of other
use cases. We see people, you know, making investment decisions,
so they're looking at evaluating pharmaceuticals and saying, you know,
what clinical trials have been run and what's out there.

(11:04):
So we're sort of having impacts across many different domains.

Speaker 1 (11:08):
Yeah, so I guess part of the scientific process is
knowing what's there, what people have done, and it's all
about building on the shoulders of giants, the people who
came before and did some of those seminal papers that
are highly cited. But it's also really about the gaps
as well. What's the novel bit where I can really

(11:28):
add value and create unique knowledge?

Speaker 2 (11:32):
So is it good for identifying those gaps? Yeah, that's
a really hard topic. Obviously, we're not saying we solve
that problem with research gaps. I think we help a
lot to get to that point where you can at
least know what other papers are out there that are on
topic for you. But obviously, yeah, there's a lot of reading,
a lot of thinking, and it is really hard to
identify and contribute to a gap. Sometimes there's so much work already

(11:55):
been done in the field that it's hard to
find a novel piece to contribute to. So yes, we definitely
help, and have people tell us that we've helped with that,
but I know it's a really difficult thing to work on.

Speaker 1 (12:05):
Yeah, and you've got two million users now, some of
them at Harvard, Stanford, Cambridge, all the big universities, you've
got researchers using it. At what point did it really
start to take off, where you started to get hundreds
of thousands of people actually using Litmaps? And what
was the spark that really triggered that early success?

Speaker 2 (12:27):
Yeah, so a lot of that came from feedback iteration.
So we shared the tool with our friends at the Faculty
of Medical and Health Sciences at the University of Auckland, for example, got
mixed feedback and iterated from there. But then, you know,
online forums where people post preprints, for example arXiv
dot org, where a lot of the leading AI
papers or physics papers are published, so there's an integration

(12:50):
there that's really helped spread the word about Litmaps.

Speaker 3 (12:53):
So it was sort of on a paper.

Speaker 2 (12:54):
You can create a Litmap from this paper, and that's
really helped with growth. And then obviously I think just
having universities recommend us. So you know, most libraries have a website
on, you know, how to search literature,
and there's often a section about Litmaps, which has really helped.

Speaker 1 (13:09):
Yeah. And as I find with AI on a regular basis,
it hallucinates and it gives you weird results. I guess
because this is based on, it's drawing on, verified information sources,
including preprints, some of these databases where it hasn't
been through the full peer review process, but you

(13:31):
can actually see the entire paper before it goes into
a big scientific journal. So I guess you're sort of
eliminating or lessening the chance of that through drawing on
all of these established databases of verified information.

Speaker 3 (13:45):
Yeah, that's been an active approach from us.

Speaker 2 (13:47):
So we take metadata from papers and that's what we
really recommend from. So there's not really a risk of us
making up a citation, and we do validate
results where we work with large language models, and the
way we work there is really important, to make
sure that things are not just made up. There's actually
a tie back to how we've recommended it, and you
can see what that means. Plus you can also filter

(14:10):
out things like preprints if you're interested in just the
published peer-reviewed papers as well.
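As a rough illustration of that kind of grounding, recommendations can be kept to identifiers that resolve in a known metadata index, with preprints optionally filtered out. The Record fields and DOI-keyed index below are assumptions for the sketch, not Litmaps' actual schema.

```python
# Illustration only: grounding recommendations in known paper metadata so nothing
# is "made up", and optionally filtering out preprints. The metadata index and
# record fields here are hypothetical.

from dataclasses import dataclass

@dataclass
class Record:
    doi: str
    title: str
    is_preprint: bool

def validate_recommendations(candidate_dois: list[str],
                             metadata_index: dict[str, Record],
                             include_preprints: bool = True) -> list[Record]:
    """Keep only candidates that resolve to a real record in the metadata index
    (so every recommendation ties back to a known paper), optionally dropping
    preprints when the user wants peer-reviewed work only."""
    results = []
    for doi in candidate_dois:
        record = metadata_index.get(doi)
        if record is None:
            continue  # unknown identifier: discard rather than risk a fabricated citation
        if not include_preprints and record.is_preprint:
            continue
        results.append(record)
    return results
```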

Speaker 3 (14:14):
Yeah.

Speaker 1 (14:15):
Research Rabbit, cool name, that's like TaskRabbit but for research.
Were they on your radar and how did this acquisition
come about?

Speaker 2 (14:25):
Yeah, so we obviously have known them in the industry.
I guess it's a small world, and we know
people in their network. I think an interesting opportunity came
up when they were looking to sell, and so we
basically said, you know, this is a great way for
two very similar tools to come together and merge and
serve a very common user base. So it's basically a story
of two independently successful tools coming together and serving a

(14:48):
very similar market.

Speaker 3 (14:49):
So we're quite excited about providing a really good.

Speaker 2 (14:52):
Service for both user bases, and over time creating
a tool that serves both groups of people.

Speaker 1 (14:58):
So will you carry on Research Rabbit's platform, and do
you plan to merge them at some point?

Speaker 3 (15:04):
Yeah, so we're still working through those plans.

Speaker 2 (15:05):
Essentially, what we want to end up with
is a tool that serves both user groups really well,
and that's the graduate student today. So whether that's a
Research Rabbit product or a Litmaps product or some
sort of merged brand, we'll see, but at the end
of the day it will be the same value offering
that is on the market today.

Speaker 1 (15:26):
Great, and a million dollars, obviously that's a decent amount
of money. Is that sort of, I guess, seed funding
you'd describe it as?

Speaker 3 (15:36):
Yeah, So we've had various levels of funding.

Speaker 2 (15:39):
I think, you know, classing these rounds, it's
basically another seed round for us, to get us to
the next level of growth and get into a Series
A or something like this.

Speaker 1 (15:46):
Yeah, and obviously the real opportunity is the international research community.
I guess that's where ninety nine percent of your user
base is. Do you have local users?

Speaker 2 (15:56):
Yeah, so about eight percent of our paid user base is
in New Zealand, so a small contingent, but it's good to have.
So obviously we're now based at Victoria University of Wellington,
and being around that is awesome, to have hands-on,

Speaker 3 (16:09):
Face to face time with users.

Speaker 2 (16:11):
But yeah, obviously we're spending a lot of that
seed capital on growth and expanding and making sure we're
at the right conferences and in front of the right people.

Speaker 3 (16:20):
So, Axton, your take on artificial intelligence and

Speaker 1 (16:24):
Its impact, particularly since we've had generative AI come along,
on how research is done. Obviously, there are applications for
the science itself. Volpara, a Wellington company, is using AI and other
tools to look at mammograms to try and identify, for instance,

(16:45):
tumors or strange-looking growths more effectively than a human
can do manually. That's been a very successful business.
There are lots of other applications of AI, materials science
and the like, but in terms of how researchers actually
set out on that path and organize their thoughts and

(17:06):
come up with the papers that are going to inform
what they're potentially going to spend a lot of money
on doing lab trials and the like.

Speaker 3 (17:15):
How is it changing, how's it evolving, and where is
AI really going to take that process in the next
few years? Yeah, I think there's a big spectrum of impact.

Speaker 2 (17:22):
So I think on the one hand, there's, you know,
things like AlphaFold 3 that have really helped move
a particular domain forward, because it can understand how we
go from sequence to structure a lot better than any
previous

Speaker 3 (17:35):
Approach to that.

Speaker 1 (17:36):
That's like protein folding. Yeah, that's hardcore, basic-level science.

Speaker 2 (17:41):
Yeah, which is a very particular domain and they've had
really great benchmarks, so you know how well the tool's doing,
which I think is really important because otherwise it's hard
to evaluate is.

Speaker 3 (17:49):
It solving things in a way.

Speaker 2 (17:51):
But there's other areas where, you know, maybe if AI
solves all the problems for you, the people who are in
training don't really have to push themselves and undergo that
education process where they test their ideas and battle-test them
and learn from their supervisor and get that sort of
critical thinking element. That might be taken away if AI
does all their job for them. So there's this interesting

(18:13):
balance of: there's really great impact and tools that are
available now, but there's also that risk and that balance
of, does this tool produce results that I can trust?
And also, is it sort of shortcutting the process that
is maybe required to get that critical thinking and important
development happening?

Speaker 1 (18:32):
It's a really good point, because the whole idea of
being a good scientist, a good researcher, is that curiosity
and that ability to go out and discover stuff for
yourself, and not have it all just handed to you
on a plate.

Speaker 3 (18:48):
So how do you deal with that in the Litmaps context?

Speaker 2 (18:51):
Yeah, and I think for us, you know, we really
feel it's human plus AI. So empowering people with the
right tools is really our approach. And obviously you could
argue some of these other tools already do that,
but I think our particular approach is really, you know,
whether it's explaining results, so the diagram to help you
see how things fit together, or just our approach to,
you know, making sure that we don't have open-ended

(19:12):
sort of AI chat. It's something where we can validate
results and show, you know, this is tied back to the
literature in this way. That's really important for us. So
that's, I guess, the way we're going about it,
to empower people. And as you mentioned, you know, if
there's R and D teams at Volpara or other companies
using these types of tools in their processes, that's really
where we want to help have an impact as well,

(19:32):
beyond academia.

Speaker 1 (19:34):
Yeah, and it's just got to be a huge efficiency gain across
the board, everything from writing research proposals and funding applications,
that takes up

Speaker 3 (19:42):
A huge amount of time.

Speaker 1 (19:44):
If you can use AI in that process, through to
writing up the results, you know, it's a months-long,
years-long process sometimes if you're submitting something to Nature
or Science.

Speaker 3 (19:54):
You have big journals. There's a lot of work required here.

Speaker 1 (19:58):
So I guess, you know, when the politicians say to us,
we're funding AI because fundamentally it's going to help on
things like healthcare, sustainability and the climate, they're looking at
all of those efficiencies that are gained, that shorten the
amount of time it's going to take

Speaker 3 (20:16):
For a research team to get something to fruition.

Speaker 2 (20:19):
Yeah, I think there's definitely, you can't deny, an impact
there of saving time in lots of ways,
and we think we are contributing as well.

Speaker 3 (20:27):
Yeah.

Speaker 2 (20:27):
I guess it's balancing that with, you know, making sure
that you are still investing time in training up the
next generation and having enough funding to have the programs
run.

Speaker 1 (20:38):
How well received is AI by, you know, the Stanford
and Harvard and MIT scientists? I'm sure the AI
scientists themselves love it.

Speaker 3 (20:48):
But is there a little bit.

Speaker 1 (20:48):
Of pushback and skepticism from scientists around AI?

Speaker 3 (20:52):
Yeah. I think again there's different levels.

Speaker 2 (20:55):
So people in the teaching realm, I think there's maybe
an epidemic of cheating, and how do you handle every
assignment being submitted and generated.

Speaker 3 (21:03):
By ChatGPT maybe, and so.

Speaker 2 (21:05):
Evaluating and dealing with that is maybe a really difficult
battle that's going on.

Speaker 3 (21:08):
And then there's the you know, publish or perish.

Speaker 2 (21:11):
Type approach, where you know you're trying to produce these
results and get into really high-quality journals. And so
maybe the AI helping write has increased the temperature of
competition and the pace of publishing there. So I think it's
really unlocking some new problems. And you know, as I
mentioned, in protein folding and other domains it's really allowing

(21:32):
people to go and solve problems they wanted to solve before,
where this was a roadblock in terms of solving sequence
to structure. So I think it's unlocking really great opportunities
but also causing headaches in other areas, where, yeah, maybe
we should assess students differently and avoid some of these issues.
But yeah, it's definitely causing headaches, I think, as well.

Speaker 1 (21:52):
I guess the next phase, and probably some researchers
are already doing it to some extent, is you go
to Litmaps, you get to identify where all the
really good research papers are, the ones you hadn't necessarily thought of.
You download all of those, then you put them into
a large language model and create a literature review, or

(22:12):
at least a first cut of a paper, and
start refining it from there. So that's another way that
it could speed up massively the process of doing science.

Speaker 3 (22:22):
Yeah, so there's definitely that knowledge work.

Speaker 2 (22:24):
We want people to probably be running those experiments rather
than, hopefully, wasting time with all the sort of
admin and behind-the-scenes stuff. So yeah, I definitely
think if we can shortcut some of that, then that's
going to help with pushing science forward.

Speaker 1 (22:39):
So what sort of operation do you have here? We're here
in Wellington at Victoria University.

Speaker 3 (22:44):
Do you have a team of coders here? Have you
done that in house? Yeah, that's right.

Speaker 2 (22:48):
So we've taken a really full stack approach, so we've
done everything from the software development piece, the design piece,
and even the marketing and getting the word out.

Speaker 3 (22:56):
So we're very full stack in that way.

Speaker 2 (22:58):
And so yeah, it's great to be based in the
university, because we can also just, you know, get students
and pay them to do user testing.

Speaker 3 (23:05):
It's really awesome to.

Speaker 2 (23:06):
Have that direct access to who we are serving. So yeah,
really proud to build the product here and export it.
I think that's something I'm also proud of is, you know,
we're not just doing some of this and exporting some
of the jobs.

Speaker 3 (23:21):
Maybe I'm also proud of.

Speaker 1 (23:22):
That, and you're probably like a number of New Zealand
software startups that had a really interesting, novel idea, got
a bit of traction internationally, then the AI wave came,
and then you had to figure out, how do we
integrate this into our core platform? What's that been like,
getting your head around these large language models, how the

(23:43):
interface between your business and the likes of OpenAI
and these other companies works, doing stuff in the cloud,
you know, where at least you're not, you know, not
having to build a large language model, which would be
vastly expensive, but there are added costs and licensing costs.
What's it been like getting your head around that, and

(24:04):
even having to, at a fundamental level, code and develop for
an AI-centric product?

Speaker 2 (24:10):
Yeah, I think a lot of that's come from, even
though we were around a bit before AI was mainstream, I
think there was GPT-2 when we were starting, and some
of the early versions, so we were really aware of what
was going on.

Speaker 3 (24:20):
So I guess because.

Speaker 2 (24:21):
We're the full stack and do some of the deeper
tech work, it means that we can really integrate with
these other tools at a very low level and take
advantage of them, from running our own models and having
GPUs that do inference, not the training part, but
actually run it at the time of getting data out of it.
But that has changed the landscape a lot, so what
people expect as well has been raised, so users expect

(24:44):
a sort of generative AI tool most of the time.
But I think you have to do something that's unique.
You can't just recreate ChatGPT for science. I think
you have to contribute your own thing. And that's really
where the visual diagrams help a lot. And that's sort
of our point of view, is that human plus AI,
as I mentioned, to leverage some of the technology and
a great user interface and deliver for the customer.

Speaker 1 (25:05):
The issue of bias is talked about a lot in
relation to AI. That's really why governments are reluctant to
use it for decision making. We've seen some examples of,
for instance, the courts in the US being biased against
Black people in decision making, so there was
a backlash, quite rightly so, against that. I guess because

(25:29):
you're citing established literature, that removes some of the
scope for bias here. But I guess there is bias
within the scientific process as well. Beyond the impact factor
of research, there are other things where, just through human bias,
we will gravitate towards certain types of information.

Speaker 3 (25:52):
Is that something you have to deal with? Yeah, I
think to a lesser extent.

Speaker 2 (25:55):
I think, yes, there's obviously the accruing of citations:
something that's highly cited gets more citations, and so
there's a sort of Matthew effect there. But yeah,
I guess from our point of view, we want to
empower the user. So if they have a way of
looking at the literature that's important to them, they can
put in their topics and then we help triage the
results by that. And obviously there might be bias in

(26:17):
the way that the model is then sorting those results
into those triaged topics for you. But I think
really being transparent and exposing how we do things
is helpful, so that you can see the bias at
least and have an understanding of what that means. I think, yeah,
inherently there's going to be issues like that to deal with,

(26:39):
but I think if we're transparent and we give users agency,
they can have impact there.
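As a purely illustrative sketch of that kind of user-driven triage, results can be grouped under the topics a user supplies, with anything unmatched kept visible rather than hidden. The keyword matching is a deliberately simple stand-in for whatever model actually does the grouping; it is not Litmaps' method.

```python
# Illustration only: sorting search results into user-supplied topics so the user,
# not the system, decides what matters. Keyword matching on titles is a simple
# stand-in for the real grouping logic.

def triage_by_topics(results: list[dict], topics: dict[str, set[str]]) -> dict[str, list[dict]]:
    """Group results under each user topic when the title mentions any of that
    topic's keywords; everything else lands in an 'other' bucket the user can inspect."""
    buckets: dict[str, list[dict]] = {name: [] for name in topics}
    buckets["other"] = []
    for paper in results:
        title_words = set(paper["title"].lower().split())
        matched = False
        for name, keywords in topics.items():
            if title_words & {k.lower() for k in keywords}:
                buckets[name].append(paper)
                matched = True
        if not matched:
            buckets["other"].append(paper)
    return buckets
```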

Speaker 3 (26:44):
Yeah. So what are the next steps for Litmaps?

Speaker 1 (26:47):
You've obviously got this business that you'll be integrating Research
Rabbit into, picking

Speaker 3 (26:51):
Up a team of people with that?

Speaker 2 (26:52):
Yeah, so part of the capital was to help scale
the team, and obviously we've had our own growth, to
one million of ARR run rate, and that meant,
you know, we need more team members to deal with that.
So yeah, that effort has really enabled us to
scale up and address the opportunity ahead of us.

Speaker 1 (27:10):
Yes, that's a million in annual recurring revenue, as they call it
in the software startup world.

Speaker 3 (27:16):
That's great. What's your business model?

Speaker 1 (27:19):
How do people pay? There's a premium tier and a free tier,
like a lot of these

Speaker 3 (27:23):
Sorts of services.

Speaker 2 (27:23):
Yeah, that's right. So we do think that, you know,
we want to deliver a quality service. So essentially it's
a freemium model, so you can get a lot of
value for free, you can throw in a few papers
that you know and quickly get recommendations. But once you
really dive into a specific topic area, you want, you know,
maybe to upload thousands of papers and see recommendations from that,
and that's when you subscribe and pay for a pro tier

(27:46):
of our tool. So it's that software-as-a-service
model where there's a large free user base and then
a smaller percentage paying. And it's interesting, you know, that
in this sort of scientific literature space we've had another
very successful New Zealand startup, Publons, which came again
out of Wellington, out of Creative HQ, the accelerator here

(28:08):
in town, and also got to I think over a million
users before it was acquired. So it's interesting, it's two
in a similar field, they were doing something slightly different. That
was more about, I think, research reviews and getting visibility
into those. But what does that say, do you think,
about how we come up with successful ideas in

(28:29):
this whole space? In New Zealand, Sir Paul Callaghan famously said,
look for the lucrative niches that no one is doing
well around the world. Is that the case for you?
Do you think that you looked at what was happening
in the US and Europe, the big bastions of academic
literature and research, and you just weren't seeing anything like it? Yeah,

(28:51):
so we definitely thought we had a unique contribution to make,
and we were lucky enough to be connected with the
founders behind Publons, and that really helped us as well.
I think they blazed a trail for us to see
that there is some value here, and you know, some
investors were on that journey so they could see, oh,
there's another possibility here. So that was really awesome. And
I think, because this is a really particular niche, I

(29:13):
think having a domain expert essentially on the doorstep has
been really helpful. And obviously, as well, Andrew from
Publons led this round with his angel group Great Up
in the UK, so there is almost, I guess, the
next generation of that sort of startup and that theme coming.

Speaker 3 (29:31):
Through as well. So I think it's very personal.

Speaker 2 (29:33):
There's a people element, but also I think there's a
lot of science and I guess expertise in Wellington. That's
meant you know, these types of things may happen in
the future as well.

Speaker 3 (29:45):
But yeah, that's my take on it.

Speaker 1 (29:47):
I'm glad to hear that about Andrew investing, because that's
the model that is really sustaining and growing the impact
of our startup ecosystem: founders reinvesting some of the proceeds
of their sales into the next generation of startups.

Speaker 3 (30:03):
So that, and the expertise as well.

Speaker 1 (30:06):
They know, they've seen the opportunities, and they're looking back
at New Zealand going, there's great ideas here.

Speaker 3 (30:11):
I want to help foster them and take them global
as well.

Speaker 2 (30:14):
Yeah, and it's really great obviously to have some of
that experience, and you know, he's worked for Web of Science
at Clarivate and run product there. So that's also a
massive thing, to have someone who's not only had a
startup and an exit but also became part of that
big bastion tool that everyone knows, and is helping to feed
back into the next generation of companies as time goes on,

(30:36):
and he's got that experience behind him.

Speaker 3 (30:38):
Excellent.

Speaker 1 (30:39):
Well, Axton, a great business and one with a great
mission, helping researchers accelerate the understanding of knowledge and the
creation of new knowledge. So good luck as you bring
Research Rabbit into the fold and begin to build your
revenue and your user base worldwide.

Speaker 3 (30:57):
Awesome, thanks so much. Thank you.

Speaker 1 (31:05):
That was Axton Pitt, co-founder of Litmaps, sharing the remarkable
story behind what I think is one of New Zealand's
most exciting emerging sort

Speaker 3 (31:15):
Of tech stories.

Speaker 1 (31:16):
It's really an example, I think, of the sort of
startup the late great physicist Sir Paul Callaghan encouraged
New Zealand to produce, one that focuses on lucrative international
niches that aren't really being well served by bigger markets.
Scientific literature, it turns out, is one of them. Publons, which
was founded by Andrew Preston in Wellington in twenty twelve,

(31:41):
allowed academics to track, verify, and showcase their peer review
and editorial contributions to academic journals. It was bought in
twenty seventeen by Clarivate, the New York Stock Exchange listed
company that owns Web of Science, a massive collection
of scientific literature databases. That was a multimillion dollar deal,

(32:02):
undisclosed exactly how much at the time. So great to
see Andrew involved in Litmaps too, as an investor
and a mentor. So thanks to Axton for coming on.
Thanks to you for listening. Head over to BusinessDesk
dot co dot nz to access the show notes in
the podcast section there, including my top ten list of

(32:24):
tech stories to read this week from around the web.
Don't forget to subscribe to the podcast via your favorite
podcasting app or on iHeartRadio. Leave a review or rating too
if you like the show, and get in touch with
your feedback and guest suggestions. Just email me on peter
at petergriffin dot co dot nz or connect with me on LinkedIn.

(32:46):
I'll catch you next week for our one hundredth episode
of the Business of Tech.

Speaker 3 (32:51):
See you then