Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Hannah Clark (00:00):
It's official. Vibe coding has entered the vernacular of the gen-pop, and we are entering a critical moment in which AI, digital products, and culture are coming together in one giant evolutionary leap. Democratizing access to digital product development, or in simpler terms, 'building on vibes', marks a permanent
(00:20):
leveling of the playing field in which literally anyone can become a founder. And while eliminating barriers to entry will mean that more great ideas will have a pathway into the market, it also means that, well, more great ideas will have a pathway into the market. Like, a lot more. And at this scale, the implications of that are so much bigger than just having way more competition.
(00:42):
It means that if we hope to be real contenders in this market, we need to hold our GTM and growth strategies to a much higher standard. And lucky for you, my guest today is Margaret-Ann Seger, Head of Product at Statsig. As you'll hear shortly, Margaret-Ann, or MA to those who know her, has an incredible sense for what breaks through the noise and gets growth engines running. But the absolute gold you'll hear in this episode are the
(01:04):
tactics that she and the folks at Statsig use to enable collaboration between their users and the product team. You are absolutely going to wanna take notes. Let's jump in. Oh, by the way, we hold conversations like this every week, so if this sounds interesting to you, why not subscribe? Okay, now let's jump in. Welcome back to The Product Manager podcast.
(01:25):
Margaret-Ann, thank you so much for making some time to chat with me today.
Margaret-Ann Seger (01:28):
Thanks for
having me on the show, Hannah.
Hannah Clark (01:30):
So can you first tell us a little bit about your background and how you got to where you are today at Statsig?
Margaret-Ann Seger (01:34):
Sure.
So I lead product and design at Statsig, which is the modern product intelligence platform. So from traditional AB testing through to offline model testing (increasingly, teams are building with models), smart feature flagging, product analytics. We help teams incorporate data at every part of their product building process in this age of AI. So pretty comprehensive platform.
(01:55):
For me, it resonates a lot because I came from bigger tech companies. I started my career at Big Tech: started at Facebook, and then spent a little over six years at Uber. Both companies were in mid-hypergrowth when I joined, but I had the luxury of having this suite of super awesome internal tools to use. And it wasn't until I left in 2020 and popped out in the real world that I realized not every company
(02:17):
has access to those tools.
And so what we're building at Statsig resonates, because we're democratizing access to that same set of tools for every company.
Hannah Clark (02:24):
That's awesome.
Today we're gonna be talking a little bit more on the growth side of things: how to nail growth and GTM strategy, speaking of democratizing, in the age of AI, when so many parts of the product development and launch process are democratized. We have to really figure out really smart, intuitive ways to stand out. So let's start out by referencing a line that you shared with me in a previous conversation, which I
(02:46):
really liked, which is: first-time founders care about product, repeat founders care about distribution. I think that's a very succinct way to frame the conversation. So when you think about that, what's the most important mindset shift you think that product leaders need to make when we're transitioning from product-first to distribution-first thinking?
Margaret-Ann Seger (03:01):
Good question.
So actually, my husband told me this line. He was telling it to me in the process of lining out his startup, so it was very timely to have that conversation. But I think the spirit of it is that at the end of the day, you can build the most amazing product in the world, but if no one knows about it or you have no way to get it in front of the right people, you're not gonna be successful. And I think this is especially true in the age of AI, right?
(03:22):
Because there's just so many more products, right? The bar for creating a product has gone down. Now everyone can create their own app, their own website, their own X, Y, Z. And so there's gonna be so much more noise. How do you stand out? How do you actually get distribution? And so there are a couple of examples that I often think of when I think of this done right. I don't know if you followed this, and it's not like a tech product example, but
(03:45):
Hailey Bieber's makeup line, Rhode, was recently acquired. And it was acquired for over a billion dollars, a super successful outcome. And that was all built on just her brand. She's probably good at makeup, but I'm sure there are a million and a half people who know more about the nuances of creating makeup and all that stuff. But she has a phenomenal brand. She had distribution. You can light that up like this. I think back to my time at Facebook: we had amazing
(04:07):
distribution, and so any feature you launched could go from zero to hundreds of millions of users overnight, because you just had a giant platform with huge adoption already. And I think you're even seeing it in the AI companies, right? Like Cursor piggybacked on top of VS Code. There are all these existing behaviors and existing distribution channels that people can then build something
(04:27):
incremental on top of, and overnight, boom, light it up. I think what is going to differentiate the companies that are really successful is: can they figure out a sustainable and creative distribution channel? And it's not just gonna be about do you have a good product, 'cause that'll almost be table stakes in the age of AI.
Hannah Clark (04:43):
Yeah, and that's exactly, I think, the concern that's on a lot of folks' minds: you might have a great idea, but the functionality of the product that you're creating isn't necessarily unique; there are many paths to that same outcome. So if we're thinking about strategies for differentiating great products from the crowd, knowing that anyone can create the same app that you can create, what are some of the ideas that you would suggest people start
(05:04):
with when they're trying to make their place in the market?
Margaret-Ann Seger (05:07):
Yeah.
I think an important one that doesn't get talked about as much is actually feedback loops. You might have an AI assistant writing your code. You might have assistance at every step of the product build and launch process. But then how are you getting feedback once it's actually out in the world? And both qualitative and quantitative feedback, right? You need to know quickly: is what you're building actually working? And just
(05:29):
get those kinds of feedback loops humming, so you're constantly getting signal and iterating accordingly. And then I think, to that point, the differentiator for many people, and actually at Statsig we see this as a differentiator for us, is speed, right? So once you get those feedback loops up and running, are you able to incorporate that feedback and iterate faster than the next person, right? And so if your users are telling you something, can you capitalize on that immediately?
(05:51):
And so I think that feedback loop plus speed are really gonna differentiate the winners. And I think one thing that we've seen is we actually will even put something out there that isn't fully built, that's really rough around the edges, just to get signal, so that then we can inform what we actually build and shortcut some of the core product building process by doing that.
Hannah Clark (06:11):
So on that note, really what this is about is getting to know the users, and I think that's always been critical, and now it's just paramount. So what's one tactical approach that product leaders can use to really understand their ICP right now, given this democratizing landscape?
Margaret-Ann Seger (06:25):
Yeah, so this is gonna be controversial, but: actually do support. I think PMs doing support and working with customers when they're having problems is actually really helpful if you want to understand your user, which, to your point, is gonna be more important than ever. We take this to the extreme at Statsig, and it blew my mind when I first joined. But we don't have a support team.
(06:48):
The entire company is responsible for support, and we've actually spun up a whole suite of complex tooling, with Slack feedback groups that auto-triage via a bot into the on-call of the day's queue, and every team member becomes on-call of the day on a rotating basis. I'm head of product, but I also do support. And so I hear when things are going wrong. I see when people are getting stuck on a certain flow, and
(07:10):
I think that actually lets me just stay in constant contact with the customer and their pain points, and just channel that into every conversation automatically. And I think this is controversial because there are so many companies, Sierra and all these AI customer support bot companies, whose basic premise is that you shouldn't have to be doing these questions; we're gonna offload this to
(07:32):
a bot or the AI. And I just think that it helps you really stay in touch. And I also think that your customers appreciate it. It almost becomes a point of differentiation in this new world.
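For readers curious how a setup like this hangs together mechanically, here is a minimal sketch of the kind of auto-triage bot MA describes, written in Python with Slack's Bolt framework. The channel ID, roster, and routing rule are illustrative assumptions, not Statsig's actual tooling; the point is just that every feedback message can be tagged with a rotating on-call owner.

```python
import datetime

from slack_bolt import App

# Placeholder credentials; in practice these come from environment variables.
app = App(token="xoxb-placeholder", signing_secret="placeholder")

ONCALL_ROTATION = ["alice", "bob", "carol"]   # hypothetical roster
FEEDBACK_CHANNEL = "C0123456789"              # hypothetical feedback channel ID


def oncall_of_the_day(today: datetime.date) -> str:
    """Rotate through the roster so each person owns triage one day at a time."""
    return ONCALL_ROTATION[today.toordinal() % len(ONCALL_ROTATION)]


@app.event("message")
def triage_feedback(event, client):
    """Route each new customer-feedback message to today's on-call owner."""
    if event.get("channel") != FEEDBACK_CHANNEL or event.get("bot_id"):
        return  # ignore other channels and the bot's own posts
    owner = oncall_of_the_day(datetime.date.today())
    client.chat_postMessage(
        channel=FEEDBACK_CHANNEL,
        thread_ts=event["ts"],
        text=f"Routing to today's on-call: {owner}",
    )


if __name__ == "__main__":
    app.start(port=3000)
```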
Hannah Clark (07:42):
Yeah, and this is actually blowing my mind. I think it's actually a brilliant strategy, because whether you're using a bot to answer customer support questions or you have a support team, if you're keeping that separate from the development process of the folks who are really closer to the product, you're telephoning that information, or you're beholden to making sure that you're checking in with the right people. But I also think it's really brilliant because otherwise,
(08:03):
what are the other ways that you're getting feedback? You're soliciting it from people who either are really eager to support a product because they love it already, or the people who really hate it. So being in the support area, it's not really a place where people expect that you're going to take their feedback into account; they're just trying to get from point A to point B, and that middle ground, I think, is otherwise very difficult to capture.
Margaret-Ann Seger (08:24):
Yeah, it's really cool, and I think it's customers in their real-world environment: you meet them there in the moment, you see the emotion. And it's cool too to see our engineers building this empathy, 'cause I think it just scales really nicely, right? It's not just the PMs who are doing this, it's the engineers. They get excited about, hey, there's an opportunity to improve this flow, I'm just gonna go ship a quick fix. And so you can almost achieve more by just doing this
(08:47):
bottoms up empathy building.
Hannah Clark (08:49):
Yeah, absolutely.
And I can really see the value too in being able to really understand the use cases that people have for the product in the moment and understand: what are they trying to achieve? Should there be an easier way, so that they shouldn't have to contact support? My brain is buzzing; I love this idea. I'm really glad that it came up. Okay, we'll move on.
So when we think about the speed of build cycles, that was the
(09:11):
other thing that you mentioned about differentiating: in terms of being able to incorporate your feedback into your product roadmap very quickly. So you suggested scrappy things: build them quickly, get them out there, get some feedback. So when you think about that, what's a framework that you'd use to decide what to build and test first, if there are just many options, many ways that you can take that?
Margaret-Ann Seger (09:31):
I alluded to this a little bit earlier, but we've started filming prototype videos. So doing the prototype and then actually filming a video as if it's a real product, and saying it's open for beta requests, if you want beta access, and we don't build it. And so the customer reaches out if they're interested, or someone sees it on LinkedIn and says, oh, this is cool, we should try this. They'll ping us, and if we get enough demand and
(09:53):
excitement, we'll go build it. And so it's a very low-cost way to gut check ideas without having to go and build a full end-to-end product and all the edge cases and nuance that you have to think about there. We've tried this with a few things and it's worked out really well, so I think we're just gonna keep doing this and basically letting the market tell us what we should or shouldn't be building.
Hannah Clark (10:12):
This is such a peek behind the curtain. I feel like this is like a Wizard of Oz reveal moment.
Margaret-Ann Seger (10:16):
Maybe
I shouldn't be saying this.
Hannah Clark (10:17):
No, I think it's brilliant, because there are many things that you could do, but if you don't have the excitement, to your point about distribution, why put all the effort into the build cycle if you just don't have that momentum?
Margaret-Ann Seger (10:27):
That's the beauty of this: we get built-in distribution on day one, 'cause we know it's there.
We've already validated.
Hannah Clark (10:31):
Fantastic.
Okay, let's talk about the use of AI more, in terms of getting through your GTM strategy. 'Cause I think that really the crux of this is: how are you gonna launch a product and be successful when there are so many products just like yours launching every moment, and a lot of people are thinking, I'll just ChatGPT it. GTM strategy? I can just prompt that. So you're laughing, and I, yeah.
(10:54):
To say, no, I'm laughing because you're right, you're a hundred percent right. So if you think about it, what is the missing piece? What are people missing by taking that strategy, that we really need to think about, that can't be automated, that we really need to do manually?
Margaret-Ann Seger (11:08):
So look, you can definitely use, I don't wanna rag on ChatGPT, you can definitely use ChatGPT for a lot of the concrete outputs. Knowing what those outputs should be and how to frame them is an art in and of itself. And there's the whole meme about how PMs are basically gonna become prompt engineers. Like, I do think you can use these tools for your GTM strategy, but you need to be prompt engineering and
(11:30):
really guiding that process. The reason there needs to be a human doing that is that it's about empathy. At the end of the day, your GTM strategy's only going to work if you have deep empathy for your customer: you understand the pain they're going through, you're able to speak their language, and you're able to position something that you know is gonna meet them where they're at. And I think all sorts of inputs are needed there, right?
(11:50):
You need to have quantitative inputs, so understanding what your users are or aren't doing; they might tell you something, they might do something else. Qualitative inputs: you should watch their sessions. You should know not just what they say, but what they're actually doing behind the scenes and where they're getting hung up. And I think that AI can actually help in both of these processes, right? There's a ton of startups right now exploring AI-driven
(12:11):
session replay synthesis, which I think is super clever, right? 'Cause you record thousands, potentially hundreds of thousands, of sessions. You can't watch all of those. You would like someone or a bot to go through and glean the key insights there and push those to you. But it still takes a human to take those insights, know how humans behave and who their ICP is and what motivates them, and then put two and two together to then
(12:33):
figure out the framing. Once you have that framing, I think you can use ChatGPT to say, hey, here's the raw inputs, here's what I'm thinking, here's the customer; can you help me position this a little bit better? But there's a lot of pre-work needed there.
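As a rough sketch of the split MA describes, where a human does the pre-work and the model only drafts the output, here is an illustrative example using the OpenAI Python SDK. The ICP, signals, framing, and model name are hypothetical placeholders, not Statsig's actual process.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The human "pre-work": inputs a PM assembles before asking for any copy.
icp = "Growth engineers at startups without internal experimentation tooling"      # hypothetical
quant_signal = "40% of new teams drop off at the metric-setup step"                # hypothetical
qual_signal = "Session replays show users stalling on warehouse connection setup"  # hypothetical
framing = "Lead with 'build on the warehouse you already trust, no new data source'"

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {"role": "system", "content": "You help draft product positioning copy."},
        {
            "role": "user",
            "content": (
                f"ICP: {icp}\n"
                f"Quantitative signal: {quant_signal}\n"
                f"Qualitative signal: {qual_signal}\n"
                f"Framing to use: {framing}\n"
                "Draft three one-sentence positioning lines consistent with this framing."
            ),
        },
    ],
)
print(response.choices[0].message.content)
```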
Hannah Clark (12:46):
Let's get into the nuts and bolts of a strong GTM strategy, 'cause I think that this is another area: if you've got an enormous amount more competition than you did before, everything has to be stronger. And in particular, your GTM strategy now, versus five years ago, has to be way more robust. So if we were to do, like, an anatomy lesson on the perfect GTM strategy now versus what would pass for a great strategy
(13:09):
in 2020, what would you say are the hallmarks of a really good, today-facing strategy?
Margaret-Ann Seger (13:17):
This is a tough question, because, I don't know, it's tough to define a great strategy today. It is very easy to say what we used to do that no longer scales, so I'll start there. And we've seen this evolution even here: I joined Statsig in early 2022, and in the last three-and-change years, we've seen this happen. A couple things have completely shifted. One is, SEO is very different now, with models and people
(13:40):
just bypassing Google and going direct to ChatGPT or Claude or Gemini; you're not gonna get much bang for your buck on SEO once critical mass moves there. The other thing is a much higher bar for differentiation. So it used to be, for example, that high-quality content was enough of a differentiator, right? If you put out a great blog, or if you had a great newsletter.
Hannah Clark (14:03):
Or podcast.
Margaret-Ann Seger (14:05):
Yeah. Yeah. Yeah, all these things were enough, and now I think it's almost table stakes in a sense. Especially, I think, written content is becoming increasingly fraught, because it's just so easy to generate, like, AI slop, essentially. And so how are you writing things that cut through that noise, especially when people are generating hundreds and hundreds of AI blogs just to rank better, right,
(14:27):
in search? The other thing, actually, and this is shifting from just what's no longer good into what I think the gold standard is: I actually think, and I feel like I've hit on this point in a number of different ways, but I actually think human touch is going to be the gold standard. Actually having a human in the loop is almost gonna be revolutionary, and people are gonna naturally gravitate to
(14:49):
the brands that feel more human. It's funny, someone was telling us, a friend was telling us, they thought our brand was really different from your typical AI company. And I was like, why? Quantify that for me. And they were like, oh, there are just so many people; you guys have your employees out there, you have videos with your PMs launching things, you have faces of individuals attached
(15:11):
to your blog posts. You're very people-first. And that's actually edgy in today's world, right? You look at the Vercels, the Linears, these super clean, minimalist, almost sterile brands, and that's the gold standard. And it almost feels messy and unconventional to actually have people, have your employees, out there.
But we've found that a lot of people love it
(15:33):
and I like it, right? It's like, we have our dogs on our website, we have the people behind the products not just filming videos, but also answering your support questions. And I think that has actually been a differentiator for us. As with everything, people joke about, what is it, the gray-floors, white-wood aesthetic in houses, and everything's just gone to that.
I think this will be similar in tech, where there's an aesthetic
(15:56):
that everyone converges on, and if you actually are different from that, or memorable in some way, that's going to be an edge for you.
Hannah Clark (16:01):
Yeah.
I think this is really consistent with something that I've been kicking around in conversations a lot lately, about how it just seems like the case, with everything from tech to fashion trends to just about everything, that there's always a pendulum swing: there's a certain saturation point where people start to crave something in the other direction, and it's almost proportionate to how saturated the specific trend we're looking at has become.
(16:23):
So I'm seeing exactly what you're saying: it used to be that everyone really gravitated towards those really clean, minimalist designs and websites. And yeah, they're beautiful and very simplistic, but they're also the easiest to generate now. And so I think it's like the transition from Instagram to TikTok, where Instagram really popularized this very polished, very shiny content, and then people started to really crave this lo-fi aesthetic where it was very hands-on, people-first.
(16:46):
And I think we're seeing a similar kind of thing, where people are like, no, I really want evidence that people are behind this. Yeah, I think there's something there. So if we're talking about growth teams using tools like Statsig's, what would you say is the biggest GTM mistake that you see teams make when they have access to a lot of different technology for measuring success?
(17:07):
There's a lot of data to parse through and some easy bad correlations to make. How do we kinda make sense of that?
Margaret-Ann Seger (17:13):
Yeah, it's interesting. Increasingly, teams are adopting tools to be more data-driven and to log data and to just have that input. The problem is that a lot of the time the inputs don't line up. So if you have four different tools and they're all logging similar actions that a user's taking, you're then having to parse through four different data sets, four different data sources.
(17:34):
They rarely agree. It gets super messy. Then people get frustrated because the data doesn't line up. You don't know what to trust. And so, ironically, the more you add on these tools and try to be data-driven, the harder it becomes to actually be properly data-driven. A big trend in the industry to meet this problem has been consolidation, and these platforms being built that start
(17:55):
to combine multiple tools, and we're one of them, right? So I'm not saying that this is a Statsig-unique thing, but: having one set of SDKs that are logging things, one source-of-truth data set, or a source-of-truth data set in your warehouse, right? Increasingly, companies have a data warehouse that is their source of truth. That's great. All these tools should be building on top of that. They shouldn't be trying to create new data sources.
(18:17):
So you're seeing that across the board, in our industry at least. And I think that actually helps a lot: once teams can unify on this stack, on a source-of-truth data set, they start to trust data again and use it. But it's almost like a slow rebuilding process, 'cause people have lost trust in data over time. So I think that's a big one. The other thing is, you'd be surprised how many folks say,
(18:40):
yeah, we're data-driven. And you're like, okay, cool, how'd that feature perform? And they're like, I don't know. And that's because it's actually hard to launch new things as experiments. It becomes this big thing: we have to set up the experiment, what's our design doc, does the DS team agree with how we're setting this up? Then we launch it and we do a big summary process. The tools that make this process really lightweight, where you're
(19:03):
just launching a feature flag, you're turning it on and off, and it automatically creates a little A/B test of the people who have the feature and the people who don't have the new feature. You can quickly gut check that this isn't tanking your business metrics or your latency and infra metrics, or spiking your cost metrics. If you can just quickly gut check that and go, you just lower that barrier to entry for A/B testing.
(19:24):
Really lightweight A/B testing. I think that's powerful. Is it going to be the proper end-to-end process and have all these more advanced stats methodologies? Maybe not, but I think it's a really good entry point for teams to just start building that muscle. So that's one thing we've been working on: just making that easier.
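To make the "feature flag doubles as a lightweight A/B test" idea concrete, here is a generic Python sketch of the underlying pattern (illustrative only, not the actual Statsig SDK): the flag check deterministically buckets each user, and every check logs an exposure event so business, latency, and cost metrics can later be compared between the two groups.

```python
import hashlib


def in_test_group(user_id: str, flag_name: str, rollout_pct: float = 0.5) -> bool:
    """Deterministic bucketing: the same user and flag always get the same answer."""
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash to a value in [0, 1]
    return bucket < rollout_pct


def check_flag(user_id: str, flag_name: str, log_event) -> bool:
    """Evaluate the flag and record an exposure so metrics can be split by group."""
    enabled = in_test_group(user_id, flag_name)
    log_event({
        "event": "flag_exposure",
        "user": user_id,
        "flag": flag_name,
        "group": "test" if enabled else "control",
    })
    return enabled


# Usage: gate the new experience, then compare business/latency/cost metrics
# between the "test" and "control" exposure groups downstream.
exposures = []
if check_flag("user-42", "new_onboarding_flow", exposures.append):
    pass  # serve the new feature
else:
    pass  # serve the existing experience
```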
Hannah Clark (19:39):
Cool.
So speaking of building that muscle: a lot of what we're talking about here is the combination of the mindset of development as well as the mindset of, how are we gonna distribute this? And this is something I think everybody on the product team now has to internalize, as we really have to have distribution and development in mind at the same time. So how do we get confident in building that kind of
(20:00):
a skillset and a mindset? I don't know if at Statsig you guys have that kind of a mindset throughout your team now, or is there something that you've encouraged for folks who aren't necessarily directly involved with the product marketing process, but still need to be thinking about how this is going to be marketed?
Margaret-Ann Seger (20:15):
It's tough to spread that DNA. One thing that I think is really cool, and this is more of a who-you-hire DNA thing: we try to hire really product-minded engineers who are passionate about the end user, who are passionate about the performance of the product, and who wanna be involved with the marketing. They wanna be helping, understanding who's actually
(20:37):
adopting this product. Is it who we'd expect? Are they using it in the way we'd expect? Do the metrics look good? And so that has been cool to harness. A concrete example of that is our infra team. They've been working on a set of tooling that's similar to Datadog, in a sense, because they wanna dogfood our product, and they wanna use our product more. And I was like, that's great, that's awesome.
(20:57):
And so they've gone really far down this rabbit hole, and now at this point they're debugging sevs on Statsig and they're doing the end-to-end monitoring and alerting and infra health day-to-day on Statsig. They came to me and they're like, hey, we're using the product for this; could we market this to customers? Could we just sell this? Could this be a SKU? I was like, yeah, let's do it. And obviously it's rough around the edges, because we're
(21:18):
our main customer right now. But that's a really cool way to just come at the problem: you have a pain point, you solve it for yourself, you realize other people might benefit from this, and you say, hey, let's go actually market this. Now the engineering lead in this area is super involved, with myself, our PMM, and our marketing team, in the positioning for the product.
Hannah Clark (21:36):
Yeah. Yeah. That, I think, really makes a clear case for how the way you launch and how you think about distribution is so nuanced and specific to the story of the product and where you're finding value in it. I would love it if we could end on another story, 'cause I'm liking this train of thought about how important it's going to be for storytelling and the human aspect to be front and center in how we distribute our products in the future.
(21:58):
Have you seen a really good example of that recently? Of a product that has really well leveraged their team? Who would you shine the spotlight on as a really good example?
Margaret-Ann Seger (22:07):
Yeah, so this is not recent; this was, I think, 2018. My gold standard that I always come back to when I think of just best-in-class GTM was actually at Uber. In 2018, there was like a six-month period where they did Driver Forward. So for context: Uber grew super fast.
(22:27):
Drivers were a huge part of that, but they often felt like, I think, a very unappreciated part of that. There was a lot of storytelling on the rider use cases that were unlocked: how this was helping people get home safely from the bar, or helping mothers get their kids to soccer practice. But there was less storytelling about the drivers behind the wheel, who were earning income in new ways, who were sending their kids to college, who were doing all these
(22:48):
really cool things because they had this opportunity. And so there was a whole backlog of driver requests and features and quality-of-life things, and just earnings visibility, all these really core workflow things that drivers had been asking about for years. We, as a business, felt like it was time to shine the spotlight on drivers.
(23:08):
And so we said, why not combine these two things? Driver Forward was basically this concept where, every month for six months, the marketing and product teams combined did like a moment: the month had a theme, and it would be one net-new feature release or improvement to the driver app, paired with a ton of storytelling, and even driver sessions where they would invite drivers to the
(23:30):
launch; there'd be a launch party, and it would launch in a different city, at a different driver onboarding center. And it was super driver-centric, and it really worked. Like, it was cool. It was a big pivot in the trust dynamic between drivers and Uber. And it was a total masterclass in having product and marketing teams basically orient the entire product roadmap
(23:51):
around these GTM moments. And so the woman who drove a lot of this was Laura Jones, who's actually the CMO at Instacart, and she's just phenomenal. But I think that particular example is, for me, the gold standard of how to take your user, take their pain, put it at the center of what you're gonna do, build a whole roadmap around it, and align marketing and product super tightly.
Hannah Clark (24:13):
Oh, okay. I love that. That's such a great example, and a very succinct way to frame what that mindset's supposed to look like in action.
Margaret-Ann Seger (24:20):
It
was really cool to see.
Hannah Clark (24:21):
Yeah.
Thank you so much for joining us today. This was a great conversation; I think we really hit so many good points in such a good, tight amount of time. If people wanna continue to follow your work, where can they find you online?
Margaret-Ann Seger (24:32):
Yeah, so LinkedIn; I go by MA, but my full name is Margaret-Ann Seger. And also the Statsig blog is pretty cool. We talk about a lot of cool things, and like I said, it's very people-centric, if you wanna get to know the team better: statsig.com/blog.
Hannah Clark (24:45):
Thanks so much.
Margaret-Ann Seger (24:46):
Thank you.
Hannah Clark (24:49):
Thanks
for listening in.
For more great insights, how-to guides, and tool reviews, subscribe to our newsletter at theproductmanager.com/subscribe. You can hear more conversations like this by subscribing to The Product Manager wherever you get your podcasts.