Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Bloomberg Audio Studios, Podcasts, Radio News.
Speaker 2 (00:18):
Hello and welcome to another episode of the Odd Lots podcast.
Speaker 3 (00:22):
I'm Joe Weisenthal and I'm Tracy Alloway.
Speaker 2 (00:25):
Tracy, we're recording this January eighth. We ran a good piece
in the newsletter this week from Skanda about AI spend
as a meaningful macro driver, or
getting close to where it starts to move the dial.
Speaker 3 (00:38):
Yeah, that was a really good piece, and I have
to say some of it is slightly worrying. But I
think one of the big things that's happening now is,
okay, AI has become such a big pillar of the market,
right? Like the entire S&P 500 basically...
Speaker 2 (00:54):
It's like an AI play.
Speaker 3 (00:55):
Yeah, it is an AI play. And so at some
point the hype has to be matched by reality, i.e.
all that investment has to be matched by some sort
of revenue, right? You have to get money out of
making this investment.
Speaker 2 (01:09):
I think there's three ways things could go. I've been
thinking about this. There's three ways this could play out.
One is companies don't get a lot of productivity gains
from these tools, they cut back spending, a bunch of
other projects get delayed, and the market goes down. Another possibility
is there's this great productivity breakthrough, companies are more efficient
(01:30):
than ever, incredible boom. That's great. And then the other
possibility is that none of this matters, and they
build God at one of these labs, and then everything
we know about economics doesn't even make any sense. So
to even talk about productivity or the S&P
500 or earnings in that regime is just
a secondary concern to the way the
world has changed when they achieve AGI.
Speaker 3 (01:52):
Joe, are you okay?
Speaker 1 (01:53):
No?
Speaker 2 (01:53):
I think those are like the three. Yeah.
Speaker 3 (01:55):
You might as well go big with your scenario analysis. Yeah,
but you're right. It could go either way. And obviously
there's a lot of talk and nervousness about a bubble
in AI at the moment. So I think it would
be a good idea to maybe try to get an
understanding of how much companies are actually spending on and benefiting
from AI.
Speaker 2 (02:15):
That's right, and we have the perfect guest, because he
has a great view into what companies are spending money on,
including AI. He's also the CEO of a company
himself in the tech space, but not directly in the
AI space, the sort of user of AI tools who
maybe could talk about what's being used, what's not being used,
and where productivity gains are being had. We're going to talk
(02:37):
about all of this stuff. We're going to be speaking with
Eric Glyman. He is the founder and CEO of Ramp,
which is a New York City based company. It's a
spend management platform that helps companies deal with expenses.
We might also talk a little bit about expense management
platforms, because I have headaches about dealing with expenses.
Speaker 4 (02:55):
I think a lot of people do.
Speaker 2 (02:57):
Yeah, I guess that's why there's a business, a new
business to be made. Eric, thank you so much for
coming on Odd Lots.
Speaker 4 (03:02):
Joe Tracy, thanks so much for having me. It's great
to be here.
Speaker 2 (03:05):
Real quickly, can you describe a little bit what Ramp is?
What's your story here?
Speaker 4 (03:10):
Well, you can think about Ramp as a financial operations platform.
It's a single place where companies can issue cards, make
payments of all kinds, and even automate both expense management,
which we'll get to, and accounting. But the ethos of
the company and why we exist is actually to help
companies save time and money. We're the only company in
our space that actually reports back to our customers how
(03:31):
much money and time we saved them. Over the
past four years, we've saved our customers over two billion
dollars and twenty million hours, and half of that has actually
come in the past year. And so we serve thirty
thousand plus companies, from small and medium to publicly traded.
Speaker 2 (03:45):
Got it.
Speaker 3 (03:46):
What was the gap in the market that you saw?
Because on the one hand, as Joe just laid out,
a lot of people hate expenses, and they're clunky and
very bureaucratic and they take forever to do. But on
the other hand, this is a space that is dominated
by some very powerful legacy players, right? I'm thinking about
American Express, for instance, and you're basically going up against them.
Speaker 4 (04:10):
These are companies that are great in their own right,
but I think we're built for a bit of a
different era. At the company you mentioned, the founders quite
literally wore top hats, you know, and were
thinking not so much about the needs
of the twenty twenties, where I think luxury
is actually having an hour to yourself at
the end of a long workweek versus having expense
(04:31):
reports to do at the end. And so we thought
the gap was a few things. First, could you actually
infuse technology not to make an easier-to-use expense
report that only took an hour instead of two hours,
but an expense report that does itself, books that
keep themselves? So the difference was we saw an
opportunity to create a card where you can tap it
to make a purchase, we pull the receipt from the merchant
(04:52):
or your email automatically, so your expense report is done
for you, your books and records are done for you,
and more. For business owners, we found this strange distortion
where companies were trying to market products like spend more money,
earn more points. But every business owner I ever met
actually wanted to spend less and be more profitable. So
we just tried to keep it simple and build a
company on those principles.
Speaker 2 (05:14):
I mentioned in the intro that because you have this expense
management platform, you have some insight into what companies are
spending money on these days on AI. Like, what can
you see? What are you able to see
about AI spend within large corporations?
Speaker 4 (05:30):
I mean, it's real. It is dramatically increasing, and in
actually interesting ways. And to give you a sense of
the panel data that we're looking at to get these insights,
we see over fifty billion a year in spend by companies.
Some of these are publicly traded, most of these are private,
and often these tend to be on the bleeding edge.
So these can be anywhere from AI research labs themselves to
(05:51):
farms to nonprofits to mom and pop shops. And this is
across both credit card data as well as bill payment data,
so it's a pretty good subset. And what
we've seen is maybe twofold. First, just in terms of
raw and aggregate numbers, an average customer on Ramp from
the start of twenty twenty three to the end was
spending about four times the number of raw dollars on
(06:13):
AI-based products, and so there's real budget that's starting
to go to this in increasing ways. And next, the products
themselves are starting to actually go from experimental to operational.
Speaker 2 (06:25):
How can you see that? What do you see in
your data that backs that up?
Speaker 4 (06:29):
So the best way we think to know this is
that if you looked at an average AI purchase, maybe
you purchased some software seat: in twenty twenty two, there
was a fifty percent chance that within the next month
a customer that bought it would still be a
customer; they were experimenting with it. In twenty twenty three,
that had jumped to a seventy percent chance. In twenty
(06:50):
twenty four, it's continuing to go higher, and we'll release data.
Speaker 2 (06:53):
Wait, sorry, fifty percent chance you continued to the next
month? Okay, got it, got it.
Speaker 4 (06:57):
And so there was a radically higher chance that you
were keeping this around. And so it went from tinkering
to starting to become a real part of
engineering processes, sales tools that teams were using to be
more productive, even back office tools to manage accounting and
manage expenses. And so I think we're still on the
trajectory; best-in-class products are going to be
(07:19):
in the nineties of percent, but the jump was dramatic
in twenty twenty three and four.
Speaker 3 (07:24):
How granular does your data go? Like, can you see
people spending on, I don't know, a basic LLM subscription
versus something else?
Speaker 4 (07:34):
Very much so. The interesting part about what we
do is that because we automate the expense report process, we
can see not just that a company spent on OpenAI,
but specifically whether it was an API call or
a ChatGPT license. And so even among products, you're seeing
itemized and SKU-level data, and so you can start
(07:54):
to get really interesting insights, even in terms of
sub-markets. One of the emergent themes that people are
talking about now: in twenty twenty three, there was only
one name in AI that mattered, and that was OpenAI.
In twenty twenty four, suddenly twenty percent of developer market
share was going to Anthropic, which was I think at
three percent in the data in twenty twenty three. And so
(08:15):
you can start to get very granular about how
this is even being used, across which models are being called,
and so it's actually this interesting level of insight that
hasn't quite been seen in these markets.
Speaker 3 (08:26):
And then, sorry to focus so much on the data,
but how do you actually classify an AI use versus
something else? Because I imagine there's a lot of software, for instance,
out there now that incorporates some sort of AI component. Right?
It feels like the Venn diagram of AI and basic
tech spending is kind of starting to come together.
Speaker 4 (08:47):
I think you're totally right. I would say it was
an easier question in twenty twenty three, when there were
only a few strange companies calling
themselves AI. Now you see kind of AI-washing from
companies that would love their stock to pop. But
I would say that we tend to classify these based
on self-identification by the companies. These tend to
be large language model labs. These tend to
(09:09):
be companies that are pure-play AI products: maybe an
ElevenLabs, if you want to generate an AI digital voice,
or a Cognition, or Devin, where you can hire an AI developer,
and these types of tools. I think you're right, though.
My sense, and if you talk to many people
in the Valley, they'll tell you there will be no
company that sells technology in five years that isn't
an AI company. And we'll see how the jury goes there.
(09:29):
But it's on that basis.
Speaker 2 (09:31):
First, a personal statement: I've never coded in my life.
So I've made a goal for twenty twenty five to
use AI to build an app. And I
actually built a really rudimentary app, but it wasn't really
doing what I wanted it to do. I'm not going
to talk about what it is. It wasn't really doing
what I wanted it to do. And then I fixed
the code and I tried to re-upload it and
I broke it. So I had an app for about
five minutes and then it died. But this
(09:53):
is like my goal: I really want to
learn to use this technology. I want to talk more, though,
about, you know, for me, when I quote-unquote use AI,
it just means going to chat.openai.com,
just the most rudimentary user interface. Talk to us
more about what you can see on the gap between
just someone subscribing to the website versus someone paying for
(10:15):
API calls, which I imagine is sort of a deeper
level of sophistication, building these models into a workflow somewhere.
Speaker 4 (10:22):
Oh, for sure. And I actually think this is the
most interesting development, because if it happens successfully, it unlocks
what many in the Valley are talking about. Historically,
technology meant software as a service: you know, you
could sell seats to people and they would do the work. And
there's this growing idea of service as software, where suddenly
(10:42):
there are workflows where AI is not just a
window you chat into and you get a response, but
is actually at every step. It becomes very, very interesting,
where you have kind of end-to-end products: videos being created,
books being closed automatically for finance teams, even art
getting created. And so what I would say
is the way you can see it mechanically is often
(11:04):
the type of license. Usually these are consumer licenses if
you're buying, like, a ChatGPT Pro subscription, versus,
let's say, enterprise or even developer plans,
where at the end of a month you get an
invoice from one of these vendors and you see, okay,
there were this many calls to this endpoint, this many
tokens were ultimately used. And while we
(11:24):
aggregate and anonymize the specific use and don't report down to that level,
you can start to see that suddenly this is much more similar
to how a company would call Amazon Web Services
or Microsoft Azure: this is core compute for some
service that relies on it. And what all of these companies
are betting on is that usage is going to grow exponentially
(11:46):
and that it's going to be deeply embedded. There are
some distinctions. One thing that I think makes this market
very different, and in some sense more vicious than anything
I've ever seen, is that usually there's this idea of lock-in.
You host all your cloud services on one provider and
you can't change from one cloud to another, except for
small use cases. But in AI there's this practice of
(12:07):
multiplexing. What developers are often doing,
and sort of why Anthropic came out of nowhere seemingly
in a way that wouldn't otherwise be possible, is people would
try some knowledge work or some response on multiple models,
open source, OpenAI, see which one's the best,
and the winning model starts getting more calls. And so
the markets are just moving faster.
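The multiplexing pattern Eric describes, fanning a prompt out to several model backends and letting the winner earn more of the traffic, can be sketched roughly like this. The backend callables and scoring function are stand-ins, not any particular vendor's API:

```python
import random

class MultiplexRouter:
    """Toy sketch of model multiplexing: sometimes fan a prompt out to
    every backend and score the answers; otherwise send the call to the
    backend that has won the most comparisons so far."""

    def __init__(self, backends, explore=0.2):
        self.backends = backends                    # name -> callable(prompt) -> answer
        self.wins = {name: 1 for name in backends}  # smoothed win counts
        self.explore = explore                      # share of traffic fanned out

    def route(self, prompt, score):
        if random.random() < self.explore:
            # Exploration: query every backend, credit the best answer.
            answers = {name: fn(prompt) for name, fn in self.backends.items()}
            best = max(answers, key=lambda name: score(answers[name]))
            self.wins[best] += 1
            return answers[best]
        # Exploitation: the current winner gets the call.
        leader = max(self.wins, key=self.wins.get)
        return self.backends[leader](prompt)
```

With `explore` set to 1.0, every prompt is scored across all backends, which is roughly the "try it on multiple models, see which one's best" behavior described above; lowering it shifts traffic toward the current winner, which is why share can move between providers so quickly.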
Speaker 3 (12:43):
So the specific companies are anonymized. But can you see
stuff on a sectoral level?
Speaker 2 (12:49):
We both went to the same place. That's what I
was thinking about.
Speaker 3 (12:51):
Like, for instance, can you see which industries seem to
be ahead in AI and which ones are perhaps lagging behind?
Speaker 4 (12:57):
We can. And I think that there's a few things
that were obvious and others that have started to jump
out in interesting and unexpected ways. The obvious ones:
certainly the earliest adopters are technologists, like technology companies. These are
engineering offices, startups. A lot of folks doing training are
trying and adopting very, very quickly and are nimble enough to
(13:20):
use this. The things that have actually
most surprised us maybe correlate to what you might see
in your own experience. A lot of newsrooms are actually using
recording tools, so if you're having a call, notes
are taken for you. A lot of sales teams are
using tools that listen to sales calls, take down
the notes, suggest the next steps, and start to go
from there. But you also see, I would say, historically,
(13:44):
if companies wanted to cut costs and sort of focus
on efficiency, their choice was to hire lower-cost labor. Often
in spaces where costs and margins are very low,
things like healthcare, which tends
to be a late adopter, the rate of increase in use
is actually starting to be much faster, because if you
can get the rules right, you can start making
(14:05):
network calls to do more work with fewer people.
These are some interesting emerging industries, but it's still quite early.
I would say the big question that people have out
there about the valuations of these companies is still very present.
The relative increase is dramatic: four x year over year in
raw dollar spend per company is very, very large. But
to start to pay back some of these large capex cycles,
(14:27):
it's still early. And so the bet that
needs to happen, from where we sit, is that AI can't
just be a product SKU that people buy. I think
to pay this back, you probably need to start
to get everyone to use it.
Speaker 3 (14:38):
Actually, this was going to be my other question. So
we're talking about AI spend. Can you see the other
side of it, in the form of, I guess, savings,
either by being more efficient or perhaps cutting jobs?
Speaker 4 (14:53):
I would say it's a very interesting question. And I would
say, generally, as AI adoption has taken place, unemployment has started
to come down, so I think these are very
real questions for the long term. And in fact, I
actually think one of the biggest misses in twenty twenty
four, for those promising AI, was that it was going to
be the year of the AI agent. Whatever was pitched,
(15:14):
you'd have the AI CFO, the AI engineer, the full
kind of jobs. And I think what you start
to see instead is slices of things happening. For example,
I hope it's not anyone's job in twenty twenty four
to do just expense reports. But actually, an AI can
do your expense reports. It can look at
your invoices; it can do the lowest-value tasks.
(15:36):
That's starting to become a present thing in these tools.
And so I think in the short run, I actually find,
generally, we hear from customers that work is getting more interesting
in some sense when AI is being adopted. I think
long term there are real questions of whether it might actually
be able to take over a workflow. And I think
practically speaking, AI often has very limited context: it gets
(15:58):
a question, it can produce a response, but it doesn't
see the rest of the knowledge work. But as it
starts being everywhere, it might be possible.
Speaker 2 (16:04):
I need that, an agentic AI, for this app
that I'm building. Because what's really annoying is,
like, I'll write some code in
Google Colab, and then I'll try to
push it to GitHub, and then I have to
go find my token, and I don't want to
do it, and then I'm like, wait, where do I
find the token in GitHub so that I can put
it in here? And that's the stuff that I don't
(16:25):
want to do. I mean, I guess I have to do it,
but I need the AI to just go find it.
Speaker 3 (16:30):
Does that count as coding if the AI is doing
the coding, if you're literally just typing in designs?
Speaker 2 (16:37):
Like, you know, I have some ideas. I'm trying,
but it's still a little bit annoying,
all these different windows and everything. Okay, but speaking of
all that, let's talk about what you're seeing in
your company. I imagine, because every engineer that I talk to
is like, yes, in twenty twenty four or twenty twenty five,
they have a window open with their AI and they
have their coding window, and it's improved their productivity. So
(17:00):
I assume that's happening in your company, that a lot
of code is being written either directly or indirectly or
with assistance from AI. Where else besides engineering? What actually
are you spending money on in terms of AI resources?
Speaker 4 (17:16):
So, the three big places. And you nailed it: number
one is engineering. It is one of the most
digital jobs. All code is digital, and so in a
strange way, that is actually arguably maybe the first industry
that is closest to AI. Second is sales and growth. Ramp,
beyond having a large amount of spend data, is one
of the fastest-growing companies or startups in history. And
(17:37):
part of what's allowed us to do this is that the
average salesperson at Ramp is about four times as productive
as at our next closest competitor. A lot of that is
a heavy use of AI to automate aspects of the
specific job.
Speaker 2 (17:48):
Let's get specific. So one of them makes a sales call.
What are they using AI for?
Speaker 4 (17:53):
So one of the most important functions, and the first
role for anyone going into sales, is this role called
a sales development rep. And the job of that person
is to book meetings: find people out there who maybe are
doing expense reports the old way, and let's bring them in.
And if you kind of decompose what that task actually is,
it's first asking, who are these businesses? What is a
(18:15):
relevant moment? Have they just raised funds? Have they just
hired someone on their finance team? Maybe they're posting an
open role that signals some needs. There are these signals out
there in the world. Then, okay, you've got
your lead list assembled, and you have to go write to them.
Maybe you have a junior person just out of college
going and joining: guess what's this person's email, what's
this person's phone number, how do I get a mailer
(18:37):
in front of them, or how do I knock on their door?
Then they write the message: what's going to
be compelling? There's all these little steps. What makes a
human seller great, and what makes sales interesting, is the
genuine human connection, someone who can go deep, understand all
the context, and actually close that great sale. And yet,
if you looked at the task in twenty twenty two,
(18:58):
most people were spending their time on things that
algorithms are great at: finding people's email addresses, testing which
copy will ultimately work better, detecting, across vast swaths of data,
what's the signal in the noise. And so, effectively, part
of what's powering this level of growth is a broad
set of AI tools which do exactly that, where AI
(19:20):
is finding the person's email, AI is detecting these signals
for intent and sending the message. And the job of an
entry-level salesperson now is, in the majority, to respond to interest,
to close people on the call. And so that's one example.
We could go through several others throughout the
sales cycle, but it's changed the role to be much
(19:41):
more interesting.
Speaker 3 (19:41):
I'm getting flashbacks to Glengarry Glen Ross. Didn't companies
used to buy lead lists as well?
Speaker 2 (19:48):
They still do?
Speaker 3 (19:50):
Yeah, yeah, right. So I imagine if you can basically build
your own lead generator, you would save some money as well.
Speaker 4 (19:56):
That's exactly right. But the interesting thing is this
feed just moves faster. There are suddenly signals of intent. Maybe there's
an IP address that you can back into: this company
has gone to your site five times today. Maybe
that's a low-value list, it's the C list, it's
not the A-grade list. But with a
modern stack that's sifting across these signals, you can get
(20:19):
more interesting. And so I think there's things like that.
The other big thing that has just transformed the job
is there's just so much more noise than there is
signal out there in the world. So if you're a sales
manager and you're trying to help a twenty-two-year-old
new in their career get better at sales, there are
just too many hours of calls to listen to across
your whole team to do that. The large language model
(20:40):
has no problem listening to a thousand hours of calls
in a single day, more than any human can. And
so, suddenly... I actually think one of the more interesting
stories and lessons I learned about this actually came from
a hundred-plus-year-old credit card company, where I
was first skeptical when I heard the story. The executive
was explaining how they stopped checking net promoter scores, and
I'm like, wow, they must have gotten that bad. But
(21:02):
the truth was something more interesting. So, a net
promoter score is this process where often you sample. So
let's say there's ten thousand people who call in to
customer support, and you ask them at the end of
the call, from zero to ten, how happy are you with the service?
And often in these surveys you ask a small sample,
and you can see in aggregate how good your agents are.
(21:23):
Their response was much more interesting. What they started doing
was applying a large language model to listen to all
calls simultaneously, and they could apply sentiment analysis to the
tone of a voice: how happy were customers or not?
And what the algorithm started doing was routing calls from
customers increasingly to the people who did a great job, that
(21:44):
made customers happy.
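The routing scheme Eric recounts, scoring every call for customer sentiment and steering new calls toward the agents whose customers come away happiest, can be sketched like this. In the real story an LLM did the listening; the keyword scorer below is a deliberately crude stand-in, and the agents and transcripts are invented:

```python
# Crude stand-in for LLM sentiment analysis: count happy vs. unhappy words.
def sentiment(transcript):
    positive = {"great", "thanks", "helpful"}
    negative = {"upset", "waiting", "cancel"}
    words = transcript.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

def best_agent(call_log):
    """call_log: list of (agent, transcript) pairs. Returns the agent
    with the highest mean sentiment across all of their calls, i.e.
    the one new calls would be routed toward."""
    totals, counts = {}, {}
    for agent, transcript in call_log:
        totals[agent] = totals.get(agent, 0) + sentiment(transcript)
        counts[agent] = counts.get(agent, 0) + 1
    return max(totals, key=lambda a: totals[a] / counts[a])

calls = [
    ("ana", "that was great thanks so helpful"),
    ("ana", "thanks"),
    ("ben", "i am upset i have been waiting please cancel"),
]
print(best_agent(calls))  # -> ana
```

The point of the anecdote survives the simplification: because the scorer can process every call rather than a survey sample, routing can favor the best performers continuously instead of quarterly.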
Speaker 3 (21:45):
It's always the case: if you're good at your job,
you get more. That's the reward.
Speaker 4 (21:51):
But it's a real way, suddenly, that you can actually
tell from every customer how they're doing, and actually
get more output per unit of input. It's a real
use case. I don't know how you'd do it a
few years ago.
Speaker 3 (22:01):
And I'm sorry, you were talking about where you're deploying
AI and you had engineering and sales, and we didn't
get to the third one.
Speaker 4 (22:10):
Oh, sure, yeah. The third, and we'll see if it works, maybe most
avant-garde, would actually be in growth and marketing.
Of course, there's customer support and other areas that
people have talked about. But I actually think there's a
real case that marketing and growth is becoming a technical function.
People thought that art would be one of the last
use cases, and suddenly you can create images, videos, interesting things.
(22:31):
And you know, now, in a world where you can
go and start to do this, you can see, based
on intent and on conversion, can you start to combine the
mathematical function of what really works with people with creating beautiful, striking,
interesting images on demand? The jury is still
mostly out on that one, but it's been real; I
think early tests are promising. And I think the
early examples of this that to me in some ways
(22:52):
inspired it were maybe twofold. One was Amazon. They
had the editorial team and the personalization team. It
used to be that folks at Amazon actually wrote and
recommended what you bought, you know, here's a newsletter of all
the things that we have, and it was like the
Sears Roebuck catalog. Eventually it became you bought this, you
might be interested in that, and eventually that won out.
And so you may be able to do that not
(23:12):
just with clustering of items, but by actually understanding people and
making beautiful things. And, you know, I think of inspirations
in New York like Andy Warhol and his Factory, where every
day they'd make something new; things were incredibly striking, and
he had this method of reaching new levels of art by
being incredibly generative. And I think that every industry in
some sense can be improved and augmented and made more
(23:34):
interesting, actually, through just the leverage that
these tools can bring.
Speaker 2 (23:39):
I'm going to be cynical for a second, and I'm
going to press you a little bit further on the
question of AI and sales. Because I certainly take your
point about searching for signal: what companies might suddenly be in
a position where they're looking to upgrade their expense
management platform, or who might be the best person
to call. Some of that, I imagine, is stuff that someone
(24:01):
was selling via Salesforce for a while and calling machine
learning five or ten years ago. Obviously,
what is technically AI is one of these things that
people argue over to get a better multiple, et cetera.
But for the purposes of what many people in the
stock market are excited about, a lot of it's the
sort of, you know, post-twenty twenty two generative
(24:24):
AI that somehow comes out of the lab and was inferenced
on an Nvidia chip or something like that. Say
more about sales: what other things can you do
in sales today, in twenty twenty five,
that you could not have done in twenty twenty one?
Speaker 4 (24:39):
I think that probably the most exciting area has to
do with reasoning. When I think about a lot of
classic machine learning, which is still deeply important, it's often
around correlation of certain variables and prediction in a very
narrow sense. And so maybe the first phase of
machine learning is prediction of what comes next. You know,
my frame around a lot of this is that now there's
(25:01):
generation based on what connects: what do you create? And
I think the most interesting is when you think about the
o1 models, o3, and the new reasoning models,
where it's thinking and can think multiple steps ahead, some
of the techniques that made AlphaGo so good at
what it does. And so you're exactly right:
some of this is actually just good
data infrastructure and prediction. But when you start to fuse
(25:24):
that with, based off of these signals, what maybe should
we write, and, based off of the context of the calls
and the usage of this data, how do we follow
up and orchestrate it across multiple steps? That's where I
think generative AI starts to get really interesting in these
boring use cases like expense management and saving people time
and money, where it starts to get very useful.
Speaker 3 (26:00):
Can you talk a little bit more about what data
exactly you're scraping to infer those specific signals? Like, what
do you have access to, and what do you find
most useful?
Speaker 4 (26:09):
Yeah. So first, some of this is... let's take a
use case. Let's say an account manager. They
might be overseeing hundreds of individual accounts at Ramp, and
their goal is to get on the phone and
make sure people are getting value out of it. We
want to save your business time and money. We want
to automate accounting. But is it set up that way?
Are you seeing these types of use cases? And so
some of this is going to be internal usage data:
(26:30):
based off of the spend that you wanted to bring over
versus what you actually spent, how is that going? Call
data: large language models can remember all calls, all notes,
what people committed to, by what dates.
And so when an account manager gets
on the phone, they can have all the
right context in front of them that they didn't need
to spend hours the night before creating;
(26:51):
it's pulled up, and pulled in terms of what's
most useful. Next, you may have interaction data we can pull
through based on the website itself: where their flows went
well, where they got stuck and confused. Maybe they
wanted to go close the books and it seemed like
too many steps, and they kind of paused. We can
kind of...
Speaker 3 (27:08):
So you're tracking like actual physical movements on the website
through beacons, I assume.
Speaker 4 (27:13):
Yeah, yeah, so you can do things like that, coupled
with even external data as well. Sometimes a company could
be doing really well, announce a fundraise, you know, need
to expand, and we make sure: hey, we've seen this good news,
we want to make sure that we're expanding with you.
And so I think any one individual point a
person can do. But often the fullness of what it
(27:33):
takes to really understand and be a partner with how
the one to thousands of employees at a single
individual customer are actually doing, so we can
be a more useful partner, is where this comes together.
Does that make sense?
Speaker 3 (27:45):
It does, although, and maybe this is a weird question,
but how does the system actually generate its suggestions?
So if I'm a salesperson and the system
spots a lead of some sort, and it thinks you
should get in touch with this person because of whatever reason,
how does that message actually get to me?
Speaker 4 (28:06):
Yeah. And I think that's exactly, in some sense, probably
the million-dollar question of which SaaS companies are going
to do really well. And I think even the story
of technology now is that you have different people with different aspects.
Some are in the browser, some have this sales tool,
some have this data tool, some are a data warehouse,
some are a training tool. Where does it show up?
And I can tell you we leave it ultimately in
(28:28):
the hands of our sales team, and the growth team
building for them: whatever works, you are free to
pick, given the period of change. But for them, often,
I'll tell you, one of the more useful tools
for that use case, for account managers, is a company
called Rox, I guess like rox dot com. They're a less
than a year old company, but effectively they're pulling data
from Salesforce, usage data, internal-level data analytics, and appending
(28:52):
notes to an account manager's calendar prior to a meeting. Here
are the core things to know, here are links if
you want to pull more, so you can get things at a
glance in your calendar, and you can actually pull from
the website itself: here's more data, go see it.
So they're trying to become a bit of a
mission control for sales. Of course, Salesforce is trying to
do these things too. In engineering, there's tools like
(29:15):
Cursor and Devin that have different bets. Cursor, like GitHub
Copilot, sort of won the love of many developers,
where as you're coding, it's like a better autocomplete:
you just say I want to build this app
and it can go ahead. It can audit lines of
code, and it knows your repo too. There's stranger bets like
Cognition's Devin, where the form factor is that
(29:36):
it is meant to be a digital AI engineer, and
you can tell Devin, I want to build this app,
go research these websites, make it look like this,
come back to me when you have questions, and I
get my app back and try it.
Speaker 2 (29:49):
I was talking to someone, I forget
who it was, maybe it was even someone from one of
these companies, talking about different approaches to some
of this stuff, including ones where, like, the AI basically controlled
the mouse, yeah, controlled the cursor, yes, and clicked on
websites and read websites. So we think of API calls,
but there's a different model where it's just like you're
(30:09):
scanning the website like a human.
Speaker 4 (30:11):
That's right. You are exactly right. And this is one
of the strangest and most interesting things about the time
that we live in, that computers can kind of think,
they can kind of see, they can kind of hear
and process different levels of data. And I think that
the general story of computing is increasing levels of abstraction.
Back eighty years ago, people were writing machine code, it
(30:32):
was one-zero binary, and over time it went to
C, then Python, which were increasingly higher-order compromises between
the language you and I speak.
Speaker 2 (30:41):
Separate from binary, like layers and layers on top of binary.
Speaker 1 (30:45):
Yeah.
Speaker 4 (30:45):
And there are many people who would make the argument,
I think convincingly so, that the next programming language, in
fact, is the English language. Yeah. And so I even think of,
now, what many people talk
about as one of these coming battlegrounds is actually what
are you seeing on your computer? And the best interface
is not chat, but actually it's just a large language
(31:07):
model that sees everything you're doing on your computer and
can predict what's next. Who knows.
Speaker 2 (31:12):
Well, when I keep getting error messages in my Google Colab,
I just take screenshots and then I upload them to
ChatGPT, and I say, what does this error message mean?
Speaker 3 (31:20):
Yeah?
Speaker 2 (31:21):
And then it tells me. One thing I've wondered about,
and you know, people are very anxious about what are
the jobs of the future and all these things, and
what professions are going to get disrupted away, and you know,
people could speculate on this forever. But one thing
I've, like, wondered about is, there are certain jobs
where, to be good at them, you have to have
probably, like, sacrificed years of your life and, like, not
(31:43):
gone to parties and not had friends, you know,
not because, like the true artist, right? No, the technical
skill required just took years and years and years, et cetera.
And so you had to be the type of person
that was willing to sacrifice a lot to get good
at them, or to build up that technical skill, right,
those hours. And I'm wondering, like, if AI is going
(32:04):
to sort of cut into the jobs where a major
part of getting good at it was this sort of
being willing to, like, sacrifice being a normal person, because
the AI could just do those thousands of hours, it doesn't need to sacrifice anything.
It's just a computer. And that it would benefit, like,
you know, the well-rounded people, who, I think,
(32:24):
you know, bring some IQ to work, but also bring some.
Speaker 3 (32:28):
Like emotional IQ.
Speaker 2 (32:29):
Yeah, exactly. This is what I'm wondering about.
Speaker 4 (32:32):
I think this is exactly the right line of questions,
and I think it's going to be a real one
to confront. And I think, in some sense, probably
even listeners of this podcast fall into this group
of, like, very curious people, and people who know how
to ask interesting questions, keep going down them, and create things.
I think they'll do very well in the future. But
yeah, it's all gonna be okay, don't worry.
(32:55):
But I think things are going to change a lot.
Like, even in thinking about coming here today, I
was curious: at the turn of the century, in nineteen hundred,
I think in the United States forty percent of all
jobs were in farming. You know, today less than one
percent of all jobs are in farming. And I think
things are okay currently, but I think that the nature
(33:15):
of jobs is going to change, probably in ways that
would have been very hard for people in nineteen
hundred to predict, and probably the same for us
and what it may look like in fifty or one hundred years.
Speaker 3 (33:24):
How are you or your clients handling privacy and legal
slash copyright concerns with some of these platforms.
Speaker 4 (33:34):
Yeah. Well, first of all, we have a bit
of a simpler time on at least copyright, in that
we're not in the business of generation, of how do
you go and create new art and images. And I
think those are real and present questions, especially for content generation.
A lot of what we're doing today is: you've made
this card transaction, you've texted back, thirty seconds later, here's
(33:54):
a photo of the receipt, and then we can match. Okay,
here's what the memo should be, here's what the accounting
category should be, and you're done. And so a lot
of this is process automation and workflow automation, and there's
an interesting value we can bring. And probably the most
interesting part of our business, where we need to think a
lot about this, is this area of price intelligence. Now,
as a consumer, you can go on to Zillow and
(34:16):
know here's what your home maybe is worth. You can
go on TrueCar and get a sense of what
should you pay for this car, based on lots of
aggregated and anonymized data. And for businesses, it's very useful
to know: what is this business paying down the street
for this supply, or for this Salesforce license? What are
people paying? And so in that part of our business,
we actually do want to be able to go to
customers and say, you know, here's how your pricing compares
(34:40):
to the rest of the market, and we want to
help you negotiate for lower prices. And we think it's
really good. The way you get there, and I think
probably the core of our strategy around this, is:
aggregated and anonymized is really the part that can come out, or,
if it's individual data, it really sits on Ramp and
is for the purpose of providing service to you. And
if we share things broadly, it's give-get: if you
(35:02):
want to see pricing data, you need to share into it.
But we need to have enough data to effectively anonymize
what's coming out there. And so we think it's
very useful for finance teams, for business owners, to help
them pay less. It's part of what helps an average
customer cut expenses by about five percent per year. But
maybe less good news for people who try to price discriminate
against our customers.
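A minimal sketch of the give-get, aggregate-and-anonymize idea described here: a vendor's benchmark is only revealed once enough distinct customers have contributed prices, so no single customer's deal is identifiable. The five-contributor floor and the choice of median are assumptions for illustration, not Ramp's actual policy:

```python
from statistics import median

MIN_CONTRIBUTORS = 5  # assumed floor before a benchmark can be shown

def benchmark_price(prices_by_customer):
    """prices_by_customer: dict mapping customer id -> price paid
    to one vendor. Returns an aggregate benchmark, or None if too
    few customers have shared data to anonymize the result."""
    if len(prices_by_customer) < MIN_CONTRIBUTORS:
        return None
    return median(prices_by_customer.values())

# Five customers' per-seat license prices for one vendor: enough to share.
prices = {"c1": 120, "c2": 95, "c3": 110, "c4": 150, "c5": 100}
shared = benchmark_price(prices)        # 110
hidden = benchmark_price({"c1": 120})   # None: would expose one deal
```

The median is a deliberate choice here: unlike a mean, one customer's outlier price cannot be reverse-engineered from it.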
Speaker 2 (35:23):
We are in an age of cracking down on spending,
and cracking down on wasteful spending in DC, for example,
and people are talking about DOGE, et cetera. If we're
going to really move the dial on spending, it's not
going to be wasteful spending; it's going to be, we're
going to have to, like, actually have different priorities. Nonetheless,
cracking down on waste seems good from the perspective of
(35:43):
an expense management platform. What are some fingerprints of wasteful
spending that you see? Can you see into companies and,
like, see the fingerprints of waste? Do you have any
advice, or things that you would look for if you
wanted to hunt waste in spending?
Speaker 4 (35:59):
I'm happy that people are thinking about this in a
real way, because often there's an obsession with how do
you spend more. But if you want to make a change,
it actually comes.
Speaker 2 (36:08):
Other than putting the entire government on Ramp. So other
than that, I assume you'd support that. But what's the
next thing you'd have to do?
Speaker 4 (36:15):
Look, maybe one of the founding fathers, Ben Franklin, I think,
was known for saying a penny saved is a penny earned.
And if you look at reasonably efficient organizations, like an
average American company, they have a profit margin of about
eight and a half percent. So mathematically, a penny saved
is actually the same as about twelve pennies earned. And I think
about organizations like the government, for which I have a lot
(36:37):
of empathy; they have a lot more complex constituencies and
needs than a profit-making entity. But the exercise of
cutting costs has not been taken seriously for a very
long time, and it's led to very different behaviors. If
you want to sell to the government today, often the
selection criterion is: can you last through a one
(36:58):
to two-year request-for-proposal, RFP, process? Which is very
different from how most companies and people select
for things: what has the most value, what's the lowest cost,
what can I try and see how it does?
The other very counterintuitive thing that's interesting about the government is,
you know, you would think that, as one of the
largest buyers of anything in the world, we would get
(37:19):
that discount for all taxpayers. If you're buying a million
licenses for a piece of software, a volume license. But it's
not that way. The typical way that government buys is
to pay sticker price and to not have discounts. And
so you see not only are different agencies paying very
different prices for the same sets of goods, but you
don't see normal, common-sense things of, hey, we should
(37:42):
have a group discount for people buying a lot
of the tools. Because procurement cycles are fifteen years,
you can't break contracts, you're paying full rates, and these
companies are going to sue you if you try to
go and do this. You start seeing some crazy things
in terms of the actual tools themselves. The spend management
architecture that the government is using was primarily selected in
(38:05):
the early two thousands. And so whereas in the private market
today you can tap a card, your expense report is done
for you, your books are kept for you, the result
is that you have an incredible amount of waste of
people's time, of really hard-working people, in some cases,
actually spending most of their time trudging through the bureaucracy
of old tools that don't talk to each other.
The most shocking, I would say, is, like, you don't
(38:27):
have to look hard. There are several friends at different agencies
who, in trying to learn and understand some of these
DOGE efforts, a shocking thing I learned is that at
multiple agencies, four hours per night, email is just shut down.
You can't send, you can't receive. It's crazy, right?
Speaker 2 (38:41):
I've seen that government websites have, like, a bedtime.
Speaker 4 (38:45):
That's exactly right, because they're using old private server contracts
from thirty years ago. And so if you want to
talk about, you know, true efficiency, it's nowhere
close to the leading edge. And, well, I think you're
exactly right: if you want to really make a dent in the budget,
you have to talk entitlements, you have to talk debt
service and what that's going to look like. But if
you want to go to this next level, great tools
(39:07):
that prevent wasting of time, that automate auditing of records,
so you don't have a Department of Defense that's failing
seven audits in a row and can't track where spending
is going, but actually it's all digital, it's all tied
together. I think you need to get serious about
allowing people to pick the best tools for the problems.
Speaker 2 (39:23):
Eric Glyman, thank you so much for coming on Odd Lots.
I'm really glad we made this happen.
Speaker 4 (39:27):
I really appreciate it.
Speaker 2 (39:41):
Tracy, I really liked that episode. There's all kinds of
AI tools I need to now dive into, because, like,
now I'm going to be the person who's subscribed
to nine different things, and that's a little bit... yeah,
I know. Like, what am I subscribed to? I'm
subscribed to at least, like, three, probably more, right now.
And now I've got to, like, dive into these specialized coding tools. But no, that was
(40:01):
really fun.
Speaker 1 (40:02):
Uh.
Speaker 3 (40:03):
Have I told you the story before about my coding
class in high school? So actually, it was just
a basic, like, IT class, but as part of the class,
we had to program our own little application. And this
was, like, in the early two thousands, so it was
all very rudimentary. But our teacher commissioned us to do this,
(40:25):
and I built a fortune cookie program where you, like,
clicked a button, well, the button looked like a cookie,
and it gave you your fortune, blah blah blah blah.
And at the end of the assignment, everyone turned in
their program to the teacher, and he made everyone sign
a contract giving away the rights, the licensing rights, to him.
(40:45):
And he said he did it to teach us all
a lesson about copyright, and how your work is rarely
your own if you're working for a big corporation, which
is why I asked that copyright question, because I still
think about that to this day. So it was a
good lesson. That's extremely funny.
Speaker 2 (41:02):
But no, there was a bunch in there that I
thought was interesting. So one, you know, the idea that,
like, you can just, like, track what percentage of
people pay for something one month and then also the
next month. Also this idea, and, like, we've got to
keep coming back to it, of the lack of lock-in
for some of these models. Right? We're really used
to there just being one winner in search, one winner
in social networking, one winner in e-commerce, and so forth,
(41:24):
one winner in photo sharing. So it's really interesting to
think about, like, all this money going to models where,
A, it's, like, really easy to move from one
to the other, it's a simple project, and B, where there are open-
source competitors that maybe are just as good. That seems
like a pretty big deal right there.
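The month-over-month number Joe mentions, the share of customers who paid in one month who also pay the next, is a standard retention calculation. A rough sketch with made-up customer sets:

```python
def retention_rate(payers_m1, payers_m2):
    """Share of month-1 paying customers who also paid in month 2."""
    if not payers_m1:
        return 0.0
    return len(payers_m1 & payers_m2) / len(payers_m1)

# Hypothetical example: four accounts paid in January, two of them
# paid again in February, so January's retention is 50%.
january = {"acct_a", "acct_b", "acct_c", "acct_d"}
february = {"acct_b", "acct_c", "acct_e"}
jan_retention = retention_rate(january, february)  # 0.5
```

Note that "acct_e" is new in February; it counts toward growth, not toward January's retention.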
Speaker 3 (41:40):
Yeah. And I guess the big question is: will there
eventually be some sort of winner that turns out to
be better at it than everyone else, or is there
going to be space for that sort of specialized, either,
you know, specialized use case, or the specialized, like, actual
interface for different jobs?
Speaker 4 (41:57):
Yeah, I think that's interesting.
Speaker 3 (41:58):
We talked a little bit about the importance of interfaces. Totally. Anyway,
shall we leave it there? Let's leave it there. This
has been another episode of the Odd Lots podcast. I'm
Tracy Alloway. You can follow me at Tracy Alloway.
Speaker 2 (42:10):
And I'm Joe Weisenthal. You can follow me at The Stalwart.
Follow our guest Eric Glyman, he's at E Glyman. Follow
our producers Carmen Rodriguez at Carmen Armann, Dashiell Bennett
at Dashbot, and Kale Brooks at Kale Brooks. For more Odd
Lots content, go to Bloomberg dot com slash odd lots,
where we have transcripts, a blog, and the newsletter, and you
can chat about all of these topics twenty four
(42:30):
seven in our discord, discord dot gg slash odd lots.
Speaker 3 (42:35):
And if you enjoy Odd Lots, if you like it
when we talk about how much companies are actually spending
on AI, then please leave us a positive review on
your favorite podcast platform. And remember, if you are a
Bloomberg subscriber, you can listen to all of our episodes
absolutely ad-free. All you need to do is find
the Bloomberg channel on Apple Podcasts and follow the instructions there.
(42:57):
Thanks for listening.