
November 4, 2025 42 mins

Dan Balcauski speaks with Yaniv Makover, CEO and co-founder of Anyword, about the evolution of AI in content creation and marketing. They discuss the impact of ChatGPT, the competitive landscape of SaaS AI copywriting tools, and the strategic differentiation Anyword employs. Yaniv shares insights on the necessity for enterprises to integrate AI, the challenges teams face in AI adoption, and the role of data integration for improved marketing performance. They explore how AI is transforming marketing workflows, the importance of testing AI-generated content, and ensuring high-quality outputs. Yaniv also emphasizes the need for companies to quickly demonstrate AI's ROI while investing in future AI infrastructure.

01:06 Meet Yaniv Makover and Anyword
02:01 The Competitive Landscape of AI Copywriting
02:32 The Evolution of AI and Anyword's Strategy
03:42 Challenges and Solutions in AI Content Generation
06:39 The Impact of AI on Marketing Teams
08:45 Navigating the AI Boom and Market Shifts
13:40 Enterprise Focus and Product Strategy
17:51 Future of AI in Marketing and Business Processes
21:51 Challenges in Business Processes and AI Integration
23:25 Adoption Struggles and Successes with AI
24:32 Decentralization and Oversight in AI Content Creation
25:53 Balancing Quick Wins and Long-Term AI Investments
27:47 Navigating AI Development and Validation
33:09 Private LLM Infrastructure for Enterprises
39:24 Rapid Fire Closeout Questions

Guest Links

Anyword.com

Yaniv Makover on LinkedIn


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Yaniv Makover (00:00):
Then ChatGPT happened, and then this is what

(00:02):
you're talking about. Everybody's like, yeah, you have a nuanced kind of solution.
What do I care?
Just give me AI to write my blog posts, and now I can write an infinite amount of blog posts.
Okay, like AI is not gonna be as good as me.
I'm always gonna be better than AI.
This is just GPT-3, whatever.
That's not the conversation anymore.
So it's pretty clear to everyone this is the future.
The call to action is that if a company doesn't adopt AI in the

(00:23):
next five years, they're gonna be at such a disadvantage.
They might die, right?
If you spent three hours prompting the AI, why don't you just write the thing yourself?
It's a cliché, but really, there are so many bad things that are gonna happen to the company, and to you, and you really need to shrug it off

Dan Balcauski (00:59):
Welcome to SaaS Scaling Secrets, the podcast that brings you the inside stories and leaders of the best scale-up B2B SaaS companies.
I'm your host, Dan Balcauski, founder of Product Tranquility.
Today I'm excited to welcome Yaniv Makover, CEO and co-founder of Anyword, an AI copywriting platform that helps performance marketers write higher-converting content.
Yaniv has scaled Anyword to over a million users, and he brings a unique technical background to the AI space with a master's in

(01:20):
information systems and data mining from Ben Gurion University.
Yaniv, welcome to the show.

Yaniv Makover (01:24):
Thank you for having me, Dan.
Excited to be here.

Dan Balcauski (01:27):
I am excited for our conversation today.
Before we dive into your scaling journey, can you give us the elevator pitch?
What does Anyword do?
Who do you serve?

Yaniv Makover (01:34):
Yeah.
We serve enterprise marketing teams.
We help them get better results from the AI that they're using.
So whatever AI they're using, OpenAI, Gemini, if they use it with Anyword, then Anyword plugs in lots of data, A/B-tested data, into LLMs.
It makes sure that the content they're generating, social posts, ads, emails, is better for their target audience.
And the promise is basically a 15 to 20% lift in results.

Dan Balcauski (02:01):
Well, Anyword operates in what might be one of the most competitive spaces in all of SaaS right now: AI copywriting.
You've got the foundational models themselves, Jasper, Copy.ai, dozens of others all launching around the same time, all realizing that AI-powered writing was going to be a big market.
I guess when you looked at that landscape, how did you

(02:22):
think about the sort of strategic challenge of competing in a space where everyone had access to these really powerful underlying technologies?

Yaniv Makover (02:31):
Yeah.
When we launched Anyword, this is like before, before ChatGPT, and

Dan Balcauski (02:37):
after ChatGPT

Yaniv Makover (02:38):
launched. My mom phoned me, was like, oh, you heard about this thing?
I was like, Mom, I've been working on this for like six years, on LLMs and AI.
Um, so it was annoying.
But really, thinking about where Anyword belongs in the ecosystem, it's something we started way, way early.

Dan Balcauski (02:53):
Hmm.

Yaniv Makover (02:53):
We were always big believers in the fact that LLMs are gonna be commoditized.
Everybody's going to be using them and it's gonna be everywhere.
Then how do you build a product and a moat?
My first company that I started was all around performance ads and content and publishers, and I just basically learned the power of creative.

(03:16):
And the data aspect that goes into creative.
So there are just better ways of saying the same thing.
And in marketing, 10% better is really what's gonna either make or break your campaign, depending on what your goals are.
And so we built Anyword with the notion that AI is gonna

(03:38):
be free and everywhere, but addressing the main, I would say, gap that AI has.
The main gap is that even though AI's been trained on the entire internet, or some teams have their own first-party data, it's not enough for it to know what actually works.
So an LLM will write 10 amazing emails.

(04:00):
One of them is gonna work better.
It just doesn't know.
It can hallucinate, and it doesn't know because it hasn't seen enough data.
And so we've actually tested this out.
If you take GPT-4o and give it a pair of texts and ask it to guess which one worked better, it won't know; it will know like 60% of the time.
It's not that it's not smart enough.

(04:20):
It's just that it hasn't seen enough data.
And so we thought, okay, that's something that, if we solve it, won't be affected by the evolution of AI.
Saying that, then building a product around it and a strategy is something different.
But that's how we thought about this problem.

Dan Balcauski (04:36):
I'm curious, 'cause you say these LLMs, I've had a very similar experience, right?
Either trying to write emails, or I do my own sort of organic social marketing, LinkedIn posts, or anything else, right?
So I've gone through some of this as well, right?
And you can ask any of the models for, hey, improve this, and it'll give you completely different directions of

(04:57):
feedback, so it could be a little bit scattershot.
But I am surprised to hear you say they haven't seen enough data, because as I understand it, these things are trained on, like, the corpus of all human knowledge.
So I assume that you're using "data" in a very specific way when you say that they haven't seen enough data.
Mm-hmm.

Yaniv Makover (05:17):
Yeah, that's exactly the problem.
They've seen the corpus of human knowledge.
But let's say you have two versions of the same social post on LinkedIn.
You want to post about, I don't know, this podcast, this recording, or having me on.
The only way you can actually tell a model what works better is you have to show it comparisons, like actual A/B

(05:37):
tests.
Because say you compare a post that you did about another recording, with me versus another guest, and that guest is way more interesting than me, has more brand, is just a better-looking person.
Then the model can't actually generalize, because that data wasn't really set up in a way where it can learn what actually works

(05:58):
better.
Now, when you're telling the LLM to generate like 15 different variations for a post about our recording or our session, you can only tweak like three words.
It's still gonna be me, so there's like 15 variations of the same post, and then those words really matter.
And so to train a model, you really need to have specific and very high-quality comparison data.

(06:18):
And without it, it still knows, like I said, 64% of the time, which is like 14% above random, but it's not gonna get to high enough accuracy.
And in marketing, like I said, even a 10% lift is everything.
So,

Dan Balcauski (06:34):
hmm.

Yaniv Makover (06:34):
it's not that LLMs don't know; they know, they just could know better.

Dan Balcauski (06:39):
I'm curious, going back to, I'm sure your mom is a very smart woman, and obviously you share the same bloodline.
But I imagine that experience was not unique to her, right?
Where everyone had this sort of ChatGPT moment where they realized, oh, this is a thing.
And that definitely jumped the gap between a very specific group of technical insiders, mostly centered around Silicon Valley, or the type of work that you were doing

(07:01):
previously.
And all of a sudden everyone's just like, oh, well, there's this tool out there.
So I'm curious, it sounded like you were prepared from the beginning of the company for this to be widespread.
I guess, what went through your mind when you realized this sea change of, okay, now my mom understands what this is?
Because I'm sure a lot of teams think, well,

(07:22):
the base models are good enough.
How did you think about, okay, how do we carve out our space within this world, to really tell this story?
Because I could imagine it could be a very nuanced story to tell, and maybe people are just like, oh yeah, the underlying models are just good enough, and now I can just create content en masse, and maybe I just create a hundred campaigns and I run my own tests.

(07:42):
It's like, how did you think about that challenge?

Yaniv Makover (07:46):
Actually, it's a good question. I think I'd divide it into like three phases:

Dan Balcauski (07:50):
Hmm.

Yaniv Makover (07:50):
before GPT, the first year, like, during ChatGPT, and then what I would call this phase right now.
Before ChatGPT, I had to go to like 50 different investors and convince them that AI is gonna be writing stuff.
And back then, models weren't that big.
There was a model called BERT, and then GPT-2, and then

(08:11):
GPT-3.
And GPT-2 would spit out like five different answers.
Some of them nonsense, one was good.
So if you do "two plus two equals" and you ask GPT-2 what it was, it would go "four."
It was like, that's amazing.
Okay, it learned math from words.
And then "four plus four," and it would go like "nine equals nine."
So I just had to get like one investor out of 50 to believe that

(08:32):
this is the future.
Forget about

Dan Balcauski (08:34):
Mm.

Yaniv Makover (08:34):
for a second.
Everybody was talking about blockchain for the past 10 years, and then I had to convince 'em about this problem.
So that was the pre-ChatGPT phase, I would say: just getting marketing teams to actually use AI.
Then ChatGPT happened, and then this is what you're talking about: everybody's like, yeah, you have a nuanced kind of solution.
What do I care?
Just give me AI to write my blog posts, and now I can

(08:55):
write an infinite amount of blog posts.
And this was, I would say, a big part of the AI explosion, but for us, we're still small, so it kinda helped us, because now we didn't need to convince any marketing team that AI works.
They all knew that it works; now it

Dan Balcauski (09:11):
Mm.

Yaniv Makover (09:11):
like some enterprise solution.
As the market is progressing, it's actually easier and easier for us to say: hey guys, you can A/B test the 500 emails I just wrote here for free.
And it's just impossible.
How do you know you're using the right version?
There's no person standing in the way.
And also, I think marketing teams are feeling the actual

(09:32):
degradation of performance when they're using AI.
They used to have an expert, and now everybody's using AI, and they're not experts.
So they can't even edit or vet the outputs.
And so I think it's easier for us; our story resonates more.
It's like, okay, AI works.
Got it.
We don't have to spend anything on that.
You're already creating content with AI.
Now let me tell you how it will work better, and you'll know

(09:56):
this before you publish.

Dan Balcauski (09:58):
So I think the one thing I heard, and you can correct me if this is a misinterpretation, is this move from, there's a step that is producing the content, but what I heard you outline there is that it's not just producing it, but taking it through the entire workflow.
'Cause you might be able to produce 500 pieces of content, but then do you have the

(10:19):
capability to actually test all of that, versus thinking through from the production into the validation workflow?
And so the base-level LLMs really only give you that first step and leave you without a way forward on the follow-on parts, on completing that whole task.

Yaniv Makover (10:35):
Yeah.

Dan Balcauski (10:36):
is that correct?

Yaniv Makover (10:37):
you're right about what you were saying, but
I actually

Dan Balcauski (10:38):
Mm-hmm.

Yaniv Makover (10:38):
meant something else.
I meant the fact that, before AI, you would have a person maybe writing three variations of the email you wanted to send out for a campaign.
And you test those emails; you send those emails to 10 people at a time.
And then the best email, you would actually send to everyone.
So I'll give you an example.
For my first company, one of the customers was the New York Times.

(11:00):
The New York Times would write eight variations of a tweet, test those variations out, and send the best version out as the actual tweet to everybody.
That's what they'd do.
So they A/B tested those eight versions.
Now you have AI.
So you can tell AI to write a thousand variations of the same tweet, and they'd be slightly different.

(11:23):
It is impossible to test a thousand variations.
It costs a lot of money and it takes a lot of time.
So now everybody has the power of AI.
But it can actually hurt you, because testing costs time and money, lots of time, lots of money.
You can't test everything, and you really need to know what the best variation out there is, or one of the best

(11:44):
variations, to use that.
And the more you use AI, you can get a feeling for that: hey, I could use this version, I could use that version, they're all great.
Marketers care about performance; they care about whether this is gonna work.

Dan Balcauski (11:56):
Hmm.

Yaniv Makover (11:57):
And so I feel like it's an easier and easier story to tell.

Dan Balcauski (12:01):
Got it.
Got it.
So this shift from how they were doing things was also part of that story.
So you talked about these three eras: the sort of pre-ChatGPT era, where you're trying to convince one out of 50 investors, like, hey, this underlying technology's gonna progress, it's gonna get better.
And then there's this sort of ChatGPT era, around when that launched.
So was that late 2022? I think the years all collapsed post-COVID.

(12:23):
It is hard to separate the years.
So I'm curious, when that happened and people realized that AI was a thing, was the plan you had been pursuing, did you realize then, going to be enough to differentiate you, or did that require a retooling of your strategy in order to separate yourself from the rest of the

(12:43):
pack?
Everyone realized, oh, AI-written copywriting is gonna be a thing, and all of a sudden you have 10 new competitors, probably not net new.
I think right before, probably six months before ChatGPT launched, actually, Jasper's based here in Austin, and I randomly was meeting some friends for drinks and ran into those guys, like six months before, and they were hyping this up.

(13:04):
And I didn't know any better.
And so they're like, oh yeah, we'll give you a free account.
And I tested it out.
But it was GPT-3, and it was exactly the experience you just denoted: you generate 10 versions, and like nine of them were gibberish and one was good, and I was like, yeah, I can't really use this to write a blog post or anything.
But all of a sudden 3.5 and then 4 come after, and now you're like, okay.

(13:25):
People realize the thing.
There's a bunch of attention, a bunch of VC money in the space.
Did that require any sort of retooling of your overall approach, or were you guys set up, with the strategy you had in place before that, to navigate that landscape?

Yaniv Makover (13:40):
I think for me and for us, it's like, okay, you have conviction about what AI is going to be, and what are gonna be the problems that you need to solve for it to

Dan Balcauski (13:48):
Mm-hmm.

Yaniv Makover (13:49):
For us it was always like, okay, we need to close the feedback loop for AI, to make it better and make it work better.
And that's gonna be a challenge.
It doesn't matter if it's AI or LLMs or agent LLMs.
The model wasn't trained on A/B tests.
You need to tell it which A/B tests work.
And so I always knew that, and I still know it; that's my conviction, our conviction: this is what we're trying to solve,

(14:11):
this is how we're gonna make AI better.
Tooling and product strategy and differentiation are different, right?
So before ChatGPT I would go and do this, and Jasper did an amazing job just creating all these templates: this thing writes an email, just making AI easy to use for everybody, for SMBs and enterprise and stuff like that.
And then what happened to us was, after ChatGPT, this actually

(14:36):
made everything better for us, because we could just focus now on enterprise.
Who cares about performance? Enterprise.
SMBs just care about, like, this thing writes really well, so this saves me a bunch of time.
That's not

Dan Balcauski (14:46):
Mm.

Yaniv Makover (14:46):
this last 10% of the conversion of the landing page, right?
So,

Dan Balcauski (14:51):
Mm-hmm.

Yaniv Makover (14:51):
they really care.
So it actually allowed us to focus on that segment, and also focus on our part.
Like, you don't have to create a thousand different templates; you can do anything with a prompt, right?
So that thing basically leveled the playing field for anyone that creates content, like copywriting.
And then when people use Anyword, when you prompt with Anyword, it

(15:15):
will tell you what talking points to add to the prompt.
So the AI knows what works for your target audience, for your channel, and for your goal.
So whatever you're trying to do, when you generate a bunch of results from the LLM, doesn't matter what LLM you're using, you're gonna get like 10 versions, and we'll rank them for you, and you can test it out.
So our biggest customers, I won't say which, are the

(15:36):
LLM foundation model marketing teams.
They use Anyword with their models to market their own thing, just to rank and stack it.
So the conviction never changed.
Definitely the product, the way the product worked, the emphasis on how it needed to work, had to change with the

(15:56):
market.
Now it's all about agents.
It's not like that changes our vision, but we'll definitely have to be a server in the agent world, playing that role.

Dan Balcauski (16:06):
You mentioned in that arc that you realized the enterprise really cared about this delta in performance of one set of copy versus another.
I guess, with that shift in the broader market, were you guys already purely focused on enterprise?
Did that make it more crisp and clear for you, or was that

(16:26):
something you already knew from before that happened?

Yaniv Makover (16:30):
So, AI is such a unique market.
We always knew that enterprise would be where we needed to go, but we just couldn't do it.
People didn't have AI, so you had to sell them AI; like, they didn't have ChatGPT.
You had to, like,

Dan Balcauski (16:42):
Mm

Yaniv Makover (16:42):
GPT-3 or GPT-3.5.
That had to be part of your product.
And then the early adopters were SMBs, and you're talking about

Dan Balcauski (16:50):
mm.

Yaniv Makover (16:51):
sold to, I dunno, millions of SMBs.
They did such a great job.
And so there was no getting around PLG, no getting around SMBs.
To get the enterprises, the market is mature enough now where if you are talking to an enterprise marketing team, they're already using ChatGPT or something, or whatever, some version of that.

(17:12):
They just need to step up their game, right?
So they already know what they're doing.
It's easier for us to show them: hey, this is AI, and this is Anyword.
And yeah, we always knew that that's where we were going, but we couldn't do it.
We just had to service SMBs.
In the beginning, we had to do this whole everything-that-you-needed-to-get, because marketing teams for enterprise weren't using it.
They weren't gonna buy it.

Dan Balcauski (17:31):
That's interesting.
That's counterintuitive from what I would've expected, in that the enablement of this broad-based technology actually made it easier for you to penetrate those markets, because they were now familiar with the underlying technology, when they were not necessarily the early adopters upfront.
I'm curious, you've hit at this a couple times, that

(17:51):
AI is fundamentally changing how marketing teams operate, and you've had a front-row seat to watch this happen, across your customers and in your own organization.
What have you observed changing in how marketing teams are structured, in terms of how they're creating content,
how work actually is gettingdone?

Yaniv Makover (18:12):
So, first of all, I think we're really early.
I think it's work in progress.
I think it's being mandated by boards and management teams to adopt AI.
Everyone that uses Anyword uses it differently.
Workflows are still different.
It's clear that people trust AI.

(18:34):
We were talking about the fact that enterprises weren't the early adopters of AI.
There would usually be some person there that is like the chief content person, whatever their title was, and they would be reluctant: this thing is not gonna be as good as me.
And that would be a valid conversation.
Like, okay, AI is not gonna be as good as me.
I'm always gonna be better than AI.

(18:54):
This is just GPT-3, whatever.
That's not the conversation anymore.
So it's pretty clear to everyone this is the future.
How to get there is not clear, right?
So

Dan Balcauski (19:04):
Hmm.

Yaniv Makover (19:05):
the whole conversation around agents, like, what an agent can and can't do.
If you cannot define the business process easily and accurately, then how can the agent do it, right?
So I could see, much faster, an agent managing your website or your social page than managing your kind of customer lifecycle marketing, the whole thing.
That's much more complicated: the more channels,

(19:27):
the more kinds of dependencies there are, the harder it is.
But I definitely do see there's easy things to do, like Anyword now has an agent that changes your copy on your website all the time.
It just tests.
It changes it and it evolves and makes better decisions, shows different copy to different people.
That's super easy.
Like, it doesn't even break your CMS.
It just knows there's a goal.

(19:48):
And it knows how to optimize for the goal.
It doesn't have to get approval from 50 different teams.
But that's a whole different thing than affecting your branding, your positioning, your strategy.
And I'll give you an example of where AI has been trying to solve a problem for, I dunno, 15 years, even before generative AI, and it's not gonna be

(20:11):
solved even with generative AI, though people would think it's super easy.
Why doesn't AI manage your budget between your search ads and your social ads?
Right?
You have a budget for paid advertising; there's no AI that will take money away from one channel to the other.
There have been plenty of attempts, plenty of tools, but you'll never see a paid ads team actually

Dan Balcauski (20:32):
Yeah,

Yaniv Makover (20:32):
do that.
Why?
That should be a simple enough problem.

Dan Balcauski (20:36):
My pet theory would be, it would be because attribution is so difficult.
Channel attribution, there's any number of ways to calculate it, and no one way captures every sort of nuance.

Yaniv Makover (20:51):
But why is a person better at it than the AI?
Like, they don't know that you're

Dan Balcauski (20:56):
That's, that's, yeah.
Fair enough.
Yeah.
Yeah, I guess nobody has any idea.
What do you think is going on there?

Yaniv Makover (21:03):
I've thought about this a lot, right?
This is like, before LLMs, even.
Okay, so if you could sell your product a hundred times for $20 profit, or 50 times for $30 profit, sometimes it's like, I can calculate the ROI and I can do that.
But there are so many dependencies on where you can end up, and those dependencies change.

(21:24):
Like, there's seasonal dependencies, and also sometimes you wanna be more aggressive with the strategy, 'cause your paid ads support your branding.
There are so many things that in a perfect world you could define for the AI, and it could do it.
It's just very, very hard to define.
And it's unstructured; it's not really well articulated.

(21:45):
There's just a lot of that stuff going on in the work that we all do in the business place.
And even a simple problem, how to spend your budget between your search and social ads, which should be super simple, doesn't happen, because there are so many contractual things that you need to take into consideration, cash flow issues,

(22:06):
whatever, ROI, attribution, all that stuff.
So you definitely have a person there, like, okay, I'll just make the decision every week, or something like that.
And I think that is a good anecdote for a lot of the business processes in a company.
So people are worried they're gonna lose their job and the AI's taking over.
It's really hard to define even my job, like your job; I wake up in the morning and I don't even know, every day, what

(22:28):
my objective was, my function objective, like, okay, I only need to do this.
I have like seven competing ones, and sometimes I don't feel like doing this one, and maybe I do the other ones, and it's not really well defined.
So I think it's hard.

Dan Balcauski (22:42):
Well, it is definitely challenging, and I think, within there, something I'd pull out for the listeners is that the closer you are to an observable metric, the better the AIs are able to take over that task, right?
But much like anything in business of any complexity,

(23:03):
like, how much is our brand worth, and how much should we invest in brand, or where should we put our positioning?
Those don't necessarily have very clear next-touch goals that we can measure against as we tweak things.
Because you end up in a very complex landscape that's hard to navigate or boil down to any particular set of numbers.
I'm curious, 'cause you talked about it's still early days, and I completely agree with that.

(23:25):
In terms of how teams are adopting AI, I'm curious, what are you seeing teams most struggle with as they try to build AIs into their workflows?
Right.
'Cause yeah, just as with where Anyword is playing.

Yaniv Makover (23:41):
So I think: who gets to use the AI to create content?
Like, if you were the email person in the marketing team, or even the ads person, you did not create your content.
You would get it from another team that created the content for you.
So there would be a centralized team, copy team, content team, whatever you wanna call them.

(24:02):
They would be the service providers for the rest of the teams.
I think now it's like, dude, I have this AI, I can just spit it out, and should I do it?
And then what kind of oversight do I need?
And if I have this oversight, am I leveraging the AI?
So I feel like there's this rewiring, changing of the roles, and

(24:23):
I'm talking about it very specifically, from a content perspective, like how Anyword sees the world.
So I know people have a much bigger view of how AI is changing the workplace.
But from a

Dan Balcauski (24:32):
Yeah.

Yaniv Makover (24:33):
there's a team responsible for every channel, and now they're creating with AI, so it's distributed, not centralized.
And then how do you make sure that they all work in sync?
And now they also want to have agents do that.
And so how do agents work with people?
So it's pretty early days: how do you approve the content, how do you make sure that everybody uses it

Dan Balcauski (24:56):
A shift in the roles and responsibilities, even among team members; yeah, I can imagine that's tough to navigate, right?
'Cause overnight, it's like you have capabilities at hand, right?
Where there's definitely a trade-off as you involve multiple groups or multiple people in any workflow: there's handoffs and coordination that has to happen, and then delays.

(25:16):
And you're like, well, I could just turn around to ChatGPT and ask for 10 versions of copy and just pick one and get on with my life.
Or I can submit this request to this other team, and then wait a couple days to get something back, and then realize it's not exactly what I wanted, and then go through that process.
So I can imagine, yeah, there's not necessarily a clear path on how you resolve those types of tensions in a business as those responsibilities shift around.

(25:38):
I'm curious, and maybe this is related to that point or something else: it's still early days, but I'm guessing you have customers where some are definitely able to adopt better and faster, and have more success than others.
I guess, in the companies that you see that are able to really adopt these tools and workflows successfully,

(25:59):
what did you see them doing that maybe other companies aren't? The leaders, the pioneers, the folks who are able to really make these systems work in their organizations.

Yaniv Makover (26:10):
Okay.
The main comment, the call to action, is that if a company doesn't adopt AI in the next five years, they're gonna be at such a disadvantage.
They might die, right?
They might

Dan Balcauski (26:18):
Mm-hmm.

Yaniv Makover (26:19):
have to do it.
Everybody knows it.

Dan Balcauski (26:20):
Mm-hmm.

Yaniv Makover (26:21):
that I think I see that they should be doing
one, you need like quick wins.
Like somebody needs a reason touse ai.
If you just, like most peoplejust have a hard problem, like
just challenge adopting it,like, how do I use it?
let's say it gives you some liftin your ad spend or your ad
performance, that's great.
That's like you need quick winsor your blogs work better,

(26:42):
whatever. Like, that's something. But the other thing is you should invest in your AI infrastructure. Enterprises need to connect AI to their data sources. They need to understand how, I wouldn't say five years from now, but two years from now, their ecosystem and their vendors will look. So you need to spend some time and budget on that, but you also

(27:03):
need to show quick wins. You can't just get into a project where it'll take you two years to map out all your data sets, all your workflows, all your use cases, and define who does what. You just need to start using it. The main thing is that most people on your team don't know how to use AI. Maybe they use ChatGPT if they're younger, but maybe they don't know how to do that, and it's a pain for them.

(27:24):
It's not doing what it needs to do. So I would find the use cases. Usually it's in customer support and in marketing. There's easy marketing content stuff that you can do: social posts, ads, refinements of existing copy. Show immediate ROI to people so they believe in it, so it'll make them better at their job. And think about the infrastructure you need to start building, and start building it.

(27:45):
So I feel like

Dan Balcauski (27:45):
Hmm.

Yaniv Makover (27:47):
the more advanced, successful teams

Dan Balcauski (27:49):
So there's a... Go ahead.

Yaniv Makover (27:51):
strike a balance. I see some companies investing in infrastructure and it'll just, again, be a two-year project, and nobody's using AI. It's not moving, and I think they're gonna struggle.

Dan Balcauski (28:02):
Yeah, so, because I've heard of a couple different patterns. I have a friend who works at one of the major tech firms. He's somewhat been anointed, but also had a proclivity for it, as one of these AI front-runners. And so he's tapped with running around the organization giving talks and giving assistance: hey, here's surfacing

(28:24):
the use cases, here's the way to organize around it, here's how to prompt better, et cetera. And in any large organization, talking about how difficult it is to find metrics to run an AI against, it's difficult to understand: is that the way, or do you carve out a team and say, hey, we're going to run them in a
(28:44):
very different way for a type,to make sure that they can iron
out all the processes before wetry to push this organization
wide.
I'm curious if you've seen anypatterns like that as, as
marketing teams have tried toadopt at least the kind of
workflows that Anyword isinvolved in.

Yaniv Makover (28:59):
Yeah.
I think the pattern I've seen is: let's try it out, let's use something and perfect our prompts, just start doing that. Content for marketing is actually easy, and maybe coding and support too. So I feel like there's a lot of low-hanging fruit in marketing for creative generation. It's gonna be much harder to get an agent to manage your

(29:22):
lifecycle and more complicated things like that. So I've seen that. I've also seen struggle: people are using AI and it's not giving them what they want, so they don't know how to prompt, and they know they should. So I don't think adoption is going as smoothly as people would think for an enterprise. It's like,

Dan Balcauski (29:39):
Yeah.
Yeah.

Yaniv Makover (29:40):
for content, which is not that difficult: just write me an email about the next thing. You have formatting, you have style, you have tone of voice, you have brand vocabulary, all of that. And this thing needs to work. If you spent three hours prompting the AI, why don't you just write the thing yourself? And then...

(30:00):
so that gap is still not solved yet, I feel.

Dan Balcauski (30:04):
I'm curious about the challenges you mentioned before, with shifts in roles and responsibilities, and how, if at all, you've experienced that internally at Anyword. As these technologies have gotten very powerful, how are you as a leader managing it? All of a sudden everyone has a peer or an assistant that they could ask, where they may

(30:26):
have used a shared services organization before. How are you, as the leader of your own organization, helping the team navigate those situations?

Yaniv Makover (30:35):
Yeah, so, first of all, I would say the really easy wins are any analysis, analyst-type jobs in the org, and there's a ton of that stuff. Like, hey, let's just ask


Yaniv Makover (30:48):
this Excel any question you have. It's a huge time saver. And because we're an AI product company, it's pretty easy for people to use that. That happens in sales, that happens in marketing. That's the easiest thing you can do. And then, from a product and R&D perspective, I think

(31:13):
AI has changed the development cycle, and for me there is a big emphasis inside the company on how to develop in this new AI development cycle, which is different. There are tons of things that are now really easy to do with AI that would've taken you, I don't know, months, from an R&D perspective. But it also changes the way you

(31:35):
validate the products for your customers: how fast you can get something to them, and does it work? Sometimes it does, and AI is unpredictable, so you have to be really good at mapping out use cases, as opposed to standard software, where this thing does three things and that's it.

(31:56):
I need to test those three things. How do you test AI? That's really, really hard.

Dan Balcauski (31:59):
Hmm.

Yaniv Makover (32:00):
Like, you don't know what input is gonna be in there. So I feel like that is a big focus for us. On the marketing side, obviously we use Anyword to create content; we dogfood, and we measure results. Our website content is managed by Anyword, and we publish content with Anyword all the time. So that's kinda like low-hanging fruit for us. On the coding side,
(32:23):
We're using some ai.
I wouldn't say that we are likeready to basically write all of
our code with with with ai.
I don't push like, I don't know.
Some teams, companies say, Hey,you just need to do that.
I think coming from developmentside, myself.
You really hate somebody else'scode.
It's like living in somebody's,a developer's worst thing is

(32:44):
like to fix somebody else'scode.
It's like living in somebodyelse's house.
It's like, it's not comfortable.
So I'm very aware of that.
So it doesn't matter if it's ai,another person not pushing like
wants to use ai fine, if not,also fine.

Dan Balcauski (33:01):
I wanna pivot a little bit, because you've mentioned the underlying infrastructure, and AI-first and AI-enabled companies are making some pretty significant infrastructure decisions. Now, correct me if I'm wrong, but it looked like, at least from what I could glean from your website, that you have private LLM infrastructure that's isolated from external providers.

(33:22):
Is that correct?

Yaniv Makover (33:24):
Yeah.
We provide that to ourenterprise customers.
Yeah.

Dan Balcauski (33:26):
So, so that seems like a pretty massive strategic
and technical investment.
I guess what sort of led to thatdecision?

Yaniv Makover (33:35):
Yeah, the market, right? So if a customer is looking for a private model, if they don't wanna share their data with the foundation models, then we will do that for them. So it was mandated, actually.

Dan Balcauski (33:48):
So totally market driven.
There was never a thought inyour mind to have it another
way.

Yaniv Makover (33:53):
It's not that we... it's table stakes. It wasn't something that differentiated Anyword at any point, and it doesn't give us any value. It's just the customer saying, hey, I need to know that you're hosting this and I'm not sharing my data with anybody else.

(34:15):
And so we went ahead and didthat.

Dan Balcauski (34:17):
So I'm curious, because when you're building a product that depends on AI models, I think AI models are a bit different. Nobody necessarily cares whether the backend database of your SaaS is Microsoft SQL Server or Oracle or any other database system. But the model potentially may be relevant to customers, and you

(34:39):
have to make a decision about how much of that to expose to customers versus abstract away. Do you let customers see or choose which models are powering their content, and how do you think about that balance?

Yaniv Makover (34:51):
Yeah, they can choose either their own internal model or an external model. It's a bit complicated, because there's generating the content, which we can completely decouple from: you can just use GPT or whatever. And then there's ranking and sorting. So there are different aspects of the product that we can plug different models into. For instance, if ChatGPT

(35:13):
creates five versions of an email, you can send them to Anyword and Anyword will rank them. The Anyword model is also an LLM, but it's privately hosted, and you can use whatever model to generate and prompt those variations. So we were careful to decouple from foundation models, because a lot of companies just use those.

(35:35):
Maybe enterprise GPT or Gemini, but still, foundation models. And some would prefer a private model. But for us, because we don't want to compete with an LLM (we're not in the LLM business), we need to be orthogonal to that. And so we do that. The way I really see it, I think, in five years,

(35:56):
if you're gonna be a vendor selling to an enterprise company, or any company, they're gonna have their LLM. They're gonna have two or three LLMs plugged into their data,

Dan Balcauski (36:05):
Mm-hmm.

Yaniv Makover (36:07):
and you're gonna have to work with that. They're not gonna use 50 LLMs; they're gonna have, like, three connected to their data.

Dan Balcauski (36:14):
Are you familiar with the company Clay, the revenue account enablement software? Anyway, I've used Clay before, and they have their AI agent tools. So you could say, hey, go scrape the website of this company and find out if they have a pricing page, or if they're B2B or B2C: stuff that wasn't necessarily stored in clean schema data that you might

(36:36):
buy from Dun & Bradstreet or LinkedIn before, right, but that you can now get with these LLM-powered agents. And so one of the things is that in their user interface you can select models: they have their own homegrown models, and then you can also select whether you wanna use Claude or Gemini, or which version of ChatGPT you wanna use, 4o or 5, or any selection.

(36:57):
And so I guess there's a power to that, but it also could be a little bit overwhelming, in that when I'm like, okay, I need this thing to go scrape this company page and see if they have a public pricing page, I don't know which one of these models is gonna be the best at it. And they have credit prices for each one. And so I'm curious how you think about that decision in

(37:20):
terms of what to expose to customers and how to guide them in that choice. Because I can imagine most people are not technical and don't really understand. I think one of the big ahas, and I'm not sure when this will release exactly, out of the recent ChatGPT-5 release is that most people who were using ChatGPT were using the 4o models. Most people had never used these reasoning models.

(37:40):
And so probably one of the big unlocks, why the majority of people think 5 is amazing, is because they'll finally get routed to a thinking model and be like, oh my God, 5 is so much better than what I was using before. Because they were just using 4o, not knowing any difference. It was really only a power user who understood that, because of OpenAI's terrible model naming, o3 was actually better than 4o. So I'm curious how you think about educating customers and making

(38:03):
it simple for customers to choose what's best for them, given all that technical complexity?

Yaniv Makover (38:09):
Yeah, I totally understand your point. I think there's a difference between a B2C and a B2B vendor. Right, so as a consumer you're

Dan Balcauski (38:14):
Mm-hmm.

Yaniv Makover (38:15):
asked to choose a model, and you don't really care whether one is better than the other. When we partner with a big enterprise marketing team, they have a CTO and they have an understanding of what they want to use. And so we need to provide either a private model or the foundation model of their choice. They probably have a working relationship with OpenAI

(38:37):
or with Google, whoever. And so we don't concern ourselves; use the model you want. We don't have the B2C issue. So I totally get your point; I just haven't spent enough time in that space where I have to. From my perspective, they're basically the same. For creating content for marketing,

(39:00):
there's not a difference between those models. It's not something that you can see. I think for more complex tasks, yes, for sure you should use the reasoning models. But it's just my experience that all the models are the same. You probably have some sort of business concern or strategy input coming from some team, and we'll just provide

(39:21):
whatever you want.

Dan Balcauski (39:24):
Well, there's a whole bunch we didn't get a chance to get to, but I wanna be respectful of your time and the audience's time. I could talk to you all day, but I wanna wrap it up with a couple of rapid-fire closeout questions. Is that okay?

Yaniv Makover (39:32):
Yep, go ahead.

Dan Balcauski (39:34):
Awesome. Well, when you think about all the spectacular people you've had a chance to work with, is there anyone who just pops to mind who's had a disproportionate effect on the way you think about building companies now?

Yaniv Makover (39:44):
One of my first investors, our first investor, is a guy named Gil. He runs a startup accelerator, and this was my first company, and he just basically taught me everything around founding a company. I spent six months in Menlo Park, in Palo Alto, and

(40:06):
I was a software engineer, and I did a master's degree before that, and my whole world changed: everything I knew about how to build a company, a business, how to bring on advisors, how to sell to customers. You have to have your own energy and drive, but if you don't know the people that will just show you, what's the

(40:27):
blueprint.

Dan Balcauski (40:28):
Hmm.

Yaniv Makover (40:28):
Maybe in Austin, where you are, and in Palo Alto, it was like everybody knows this stuff. But it's not common knowledge everywhere else; the rest of the world is like, how do you do this? So yeah, I credit a lot of what we've achieved to him.

Dan Balcauski (40:44):
Was there any one thing of that that sticks with you all these years later? Maybe it's cliche now, but it was surprising at the time in the way you thought about building companies.

Yaniv Makover (40:53):
I think people always ask, should I quit my job and start this company? Right? Are you familiar with this question? People ask you

Dan Balcauski (41:00):
Yes.
Yeah.
Yeah.

Yaniv Makover (41:00):
as a founder, and kinda the answer is, I don't know. I barely know enough about my own business. But what I do think, and maybe it's a cliche, is: do you wanna work on this problem? Because you're gonna spend like 10 years on this. Is this something you wanna work on? You're gonna probably change your solution, your product, like 20 times. You're gonna pivot. Do you really want to work on it? Because stamina is everything.

(41:23):
It's a cliche, but really, there are so many bad things that are gonna happen to the company and to you, and you really need to shrug it off, or you're just gonna go. It's easier just getting a job, right? It's just much easier. So you hear

Dan Balcauski (41:40):
yeah.

Yaniv Makover (41:41):
that a lot.

Dan Balcauski (41:42):
Well, Yaniv, this has been fantastic. If listeners want to connect with you and learn more about Anyword, how can they do that?

Yaniv Makover (41:47):
So, Yaniv Makover: you can find my LinkedIn, and Anyword.com. Yeah, so pretty easy.

Dan Balcauski (41:55):
Yeah. Well, I will put those links in the show notes for our listeners. Everyone, that wraps up this episode of SaaS Scaling Secrets. Thank you, Yaniv, for sharing your journey and insights. For our listeners, if you found any insights valuable, please leave a review and share this episode with your network. It really helps the podcast grow.