Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
It's cheaper to prevent issues than to apologize for them. So if we're able to actually review ahead of time everything that is currently wrong with our support process, we can work on that, we can improve it, and then maybe it will cost some time and some money to actually do that, but it will save us a lot in the future.
Speaker 2 (00:21):
Welcome back to Live Chat with Jen Weaver. I'm so glad you're here. Today I'm sitting down with Chloe Kurz-Barat, the mastermind who turned a fast-growing ad tech support team into a 97-plus CSAT powerhouse. We'll unpack how her tiny QA squad reviews just a handful of
(00:42):
chats each week yet drives product change, why "QA to save the day" is more than just her motto, and how she keeps a very human touch even while letting AI handle the heavy lifting. We also dig into what it was like for her team when the support QA tool, Klaus, eventually stopped working after
(01:04):
it was acquired by Zendesk. That's something I've wondered about for a really long time. So whether you're launching quality from scratch or you're leveling up an existing program, I hope this podcast is full of practical ideas that you can actually steal and use. Before we get started, though, our QA tool, SupportMan, is what
(01:24):
makes this podcast possible. So if you're listening to this podcast, head over to the YouTube link in the show notes to get a glimpse. SupportMan sends real-time QA from Intercom to Slack, with daily threads, weekly charts, and done-for-you AI-powered conversation evaluations.
(01:45):
It makes it so much easier to QA Intercom conversations right where your team is already spending their day, in Slack.
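For readers curious what that kind of Intercom-to-Slack hand-off looks like in practice, here is a minimal illustrative sketch, not SupportMan's actual implementation: it posts a single conversation's QA result to a Slack channel through an incoming webhook. The webhook URL and the score and notes fields are hypothetical placeholders.

```python
# Illustrative sketch only: post one QA review result for an Intercom
# conversation to Slack via an incoming webhook. The webhook URL and the
# score/notes fields are hypothetical placeholders, not SupportMan's real API.
import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder

def post_qa_summary(conversation_id: str, agent: str, score: int, notes: str) -> None:
    """Send one conversation's QA result to a Slack channel."""
    text = (
        f"*QA review* for Intercom conversation `{conversation_id}`\n"
        f"Agent: {agent} | Score: {score}/5\n"
        f"Notes: {notes}"
    )
    resp = requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10)
    resp.raise_for_status()  # the webhook replies 200 "ok" on success

if __name__ == "__main__":
    post_qa_summary("123456789", "Michael", 4, "Great tone; add a screenshot next time.")
```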
All right, on to today's episode. We're going to talk about QA, but first I heard that you have a really unconventional path to support. Can you tell us a little bit more about that?
Speaker 1 (02:06):
Yeah, so thank you for having me. I started as a customer success manager, and after almost five years overall in customer success roles, we had this amazing value that everybody does support. So as a customer success manager we would have to be in support, handle customer chats, and I kind of fell in love with
(02:28):
it, really enjoyed solving customers' issues and understanding the root cause and how we could get there so that it wouldn't happen again. At the time the support operations team lead was a French colleague. I talked to him almost every week about
(02:48):
different support topics, and he said, I really want to have you on my team, and I said, well then, I'm going to join. So here I am, four years later. I joined right after COVID.
Speaker 2 (02:59):
Four years later you're here and enjoying it very, very much. And so your role is support operations, is that right?
Speaker 1 (03:07):
So support operations is a team that I'm part of, and I'm mainly focusing on QA tasks and internal processes between support operations and all the other teams within our company.
Speaker 2 (03:18):
That's great. As you know, we're huge proponents of QA because that's what SupportMan, our tool, does, and so I love that we get to dig into that further. I just recently did a talk at the Support Driven Expo about QA, so this is following on that really nicely. But before we get into your work with quality, I would love
(03:43):
to hear about what a week in your life looks like. What's your typical week?
Speaker 1 (03:49):
I would say the number one focus, other than actually QA, is being in support. So we still have support shifts. I am still in support handling chat, facing customers, and working on our internal processes through the support chat. So I have three shifts of four hours every week that I need to
(04:10):
be in to handle our customers' cases. During those support shifts I usually also train new joiners and help them get onboarded within the support role overall and handle the chats. That takes, let's say, three mornings out of my week, and then outside of that I'm focusing on QA reviews of
(04:34):
specific chats, and I'm actually right now working on an AI project for QA. That's a big project that is taking a lot of my time. In terms of meetings, I have a lot of meetings with product managers but also engineering leads to make sure that the support process overall is good for them in how we are
(04:56):
escalating cases, how we're sending feedback, what can be improved, and specific topics that we need to focus on, or roadmaps, to understand exactly what is about to come for our customers, so we can also be prepared within our team to train our support agents or to get all the documentation ready, and then
(05:18):
internal meetings as well with the rest of the team. Because we are a global team, we have people based in, well, I can give you all the locations: Manila, Singapore, Dubai, Helsinki, Berlin, I'm based in Madrid, and then we have New York, Chicago, Guatemala, and we also have somebody in San Francisco.
Speaker 2 (05:40):
So you're like everywhere, for sure.
Speaker 1 (05:44):
Which helps us cover 24-7 support. But yeah, I'm juggling different working hours so I can talk to everybody and make the most out of it.
Speaker 2 (05:55):
You mentioned to me
that QA helps you identify
broken processes.
That might be a good place to start.
Speaker 1 (06:01):
I would say one of the main processes I actually started working on when I joined the support operations team was the technical escalation. Because, as I said before, we had this value in the company that everybody does support, we would have engineers sitting in support with us just handling some chats. The problem is that within our platform we have so many
(06:22):
features and so many parts of the tool that not all engineers are familiar with everything. They each have their special feature that they're responsible for. So we'd have a chat coming in about, let's say, reporting, and the engineer sitting in support would be focused on creatives, so they wouldn't even know how to answer those requests that we had, and the escalation wasn't very quick and efficient.
(06:44):
So the first thing that we did is actually take engineers out of support and see how we could reach out to each individual team better, so that the escalation process would be a bit more seamless. We started with what we call support tickets, and we integrated the JIRA process within our escalation process.
(07:06):
We already had it for our bug reports, but now we added what we call support tickets, coming directly from support escalations. We revamped all of this so that whenever engineers get a ping that a support ticket has been created for their specific feature, they know that it's for them. So that was time-saving for engineers, not having to be in support.
(07:27):
So it was a huge saving of money, but we're also able to better track exactly the issue that was actually happening. Our engineers are only located in the EMEA time zone, mainly in Helsinki and Berlin, but because our support is 24-7, our teams in the US or in APAC have technical issues that they need to escalate, and whenever
(07:49):
we did that before, we didn't always have engineers to actually help us because they were not always in support. So now we are able to just create a support ticket, and when they get online the next working day, they just go into their support ticket board and check exactly what was created during the night, and they're able to work on that investigation and the issue that was reported.
(08:10):
And whenever they leave comments, it just gets back to the teams in the Americas, so we can also do a follow-up from the EMEA time zone. That way we really have real-time tracking and task tracking, both for them and for our teams, to make sure we're giving the customers all the information based on the escalation that has been done.
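For teams that want to build a similar escalation flow, here is a rough sketch under stated assumptions, not their actual setup: it files a support ticket through Jira Cloud's REST API and labels it by feature so the owning engineers can pick it up from their board the next working day. The base URL, project key, issue type, and label scheme are all hypothetical placeholders.

```python
# Illustrative sketch of a support-ticket escalation into Jira, not the team's
# exact setup. The site URL, project key "SUP", issue type, and label scheme
# are hypothetical placeholders.
import os
import requests

JIRA_BASE_URL = "https://example.atlassian.net"                   # placeholder Jira Cloud site
AUTH = (os.environ["JIRA_EMAIL"], os.environ["JIRA_API_TOKEN"])   # Jira Cloud basic auth

def create_support_ticket(feature: str, summary: str, chat_url: str) -> str:
    """Create a Jira issue tagged by feature so the owning engineers get notified."""
    payload = {
        "fields": {
            "project": {"key": "SUP"},                             # hypothetical support project
            "issuetype": {"name": "Task"},
            "summary": f"[Support] {feature}: {summary}",
            "description": f"Escalated from support chat: {chat_url}",
            "labels": ["support-ticket", feature.lower().replace(" ", "-")],
        }
    }
    resp = requests.post(f"{JIRA_BASE_URL}/rest/api/2/issue",
                         json=payload, auth=AUTH, timeout=15)
    resp.raise_for_status()
    return resp.json()["key"]  # e.g. "SUP-1234", visible on the team's board

if __name__ == "__main__":
    key = create_support_ticket("Reporting", "Totals mismatch in weekly export",
                                "https://app.intercom.com/...")
    print(key)
```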
Speaker 2 (08:28):
That sounds like a
really great system, and I
wonder if the engineers are almost relieved that they don't
have to be in support anymore.
Speaker 1 (08:34):
Oh yeah, so every time we have new engineers joining, we tell them a little bit about support, how it works, and also how the engineering organization is involved. And one question that comes back, I would say, in every session is: do I need to be talking to customers and be customer-facing? Because I don't want to. I'm like, no worries, it's not going to happen, we're going to manage that for you.
(08:55):
So I don't mind handling the communication towards our customers and letting the engineers just solve the issues so they don't happen again.
Speaker 2 (09:11):
Yeah, as a support person, I find that really validating: customer support is a skill that not everyone has or wants to develop, and as support people, we've maybe been developing it for years. Sometimes it feels like it's not even a thing, but it really is a thing that other people are sometimes even afraid to deal with. Oh yeah, yeah. So I know you have a lot of other cross-functional wins from your QA system, but I wonder if first we can go back and talk a
(09:35):
little bit about the history of QA on your team.
Speaker 1 (09:39):
The support organization overall has existed since the company was founded 12 years ago. When we started QA, we had 20, 22 agents, something like that. Now we're up to 37, 48. We started experimenting with QA in 2021, but based on the scale that we were at and the size of our team
(10:02):
and the number of customers that we had, from a leadership perspective it was still not the right time to actually go into QA. We were still reviewing from time to time, but not very thoroughly and not with a specific, detailed process and categories. And then in 2022, my current team lead just told me, we need to scale this, because we had been flat for a long time
(10:24):
in terms of CSAT and had never gone above 97. We were always around 96.5, 97, but never going above. And we were starting to see as well from leadership that it was better if we could actually increase that a little bit more towards 98 than remain at 97. So we also started prospecting QA platforms to see how we
(10:45):
could do it, because in general we were dealing with spreadsheets. We started using Klaus at the time, before it was acquired by Zendesk, and we were reviewing between 100 and 120 chats per week. It was very interesting at first, because when we were able to pick chats based on specific characteristics that we were filtering on in the Klaus platform, we really started to see
(11:08):
some patterns that we could improve very easily, like, for example, what we call visual aids. Whenever we were passing on information to our customers on a step-by-step basis, like what you need to do, we were sometimes forgetting to include a screenshot of where they could find a specific button, or a screen recording, small GIFs, something like a one-, two-, three-second snippet,
(11:31):
it doesn't have to be too long, just to send them something so that we could actually illustrate what we were telling them. And by just doing that, we started seeing customers were a little bit more responsive, especially because we had a few people in the team who had gotten into the bad habit of just sending a link to the knowledge base: you want information, yeah, you can find it in here.
(11:51):
And then we realized that if customers are reaching out to us through the support chat, it's because they want to talk to somebody. They don't want to go to a knowledge base article and just read something that they could obviously find there, where nobody's telling them exactly what to do or where to put it. So we started changing things like, for example, the visual aids and the step-by-step guides that we were sending, and also
(12:13):
reviewing the welcoming messages, to have something a bit standardized and not have something super friendly or overly strict, very cold, because we have a live chat with our customers. We still wanted to be friendly, but not too friendly, not going like 'yo', you know. We wanted to be just like, we're here to help, we're like an extension of your team and we're
(12:37):
here to just guide you on how you can solve the issue and help you out through the process. So the tone and welcoming messages were also something that we worked out very easily. But I would say the biggest change was definitely the visual aid part. How did you identify that problem? At first, I started obviously by picking the chats that got
(12:58):
negative ratings, to understand where the frustration from the customers was coming from, whether it was the support that we provided, or more something about the answers that we gave them, or a solution that was not correct, or something like that. And we started seeing that the solutions that we were providing were actually correct, quite on point, but the way
(13:19):
we were communicating was very cold, just do this, do that, and not always showing them where they have to go to actually solve it.
Speaker 2 (13:28):
Yeah, so you started out without a particular tool, and you were using kind of your common sense to get some big wins.
Did you see that CSAT change?
You said it was like 96, 97.
Speaker 1 (13:40):
In six months' time after implementing Klaus, we already saw an increase, even if it was just a few tenths of a percentage point, because on a monthly basis we were never below 97.2, 97.3, which was already good. You see, it's just 0.2, 0.3 points, not that much, but at least we were making a change. And now for the past two years we've never gone below 97.5.
(14:04):
Or, if we have, it was a specific month where we had a lot of new integrations and releases coming from external APIs, so we had a lot of frustration coming from customers. But we have also had a peak at 98.9, which is just crazy good. We were never expecting that, but year to date we have been at 97.6 and we haven't gone down.
It's only increasing.
Speaker 2 (14:29):
You mentioned Klaus
and how wonderful that was for
your team.
I also really loved using Klaus.
Can you tell us a little bit more about how long you used it
and what that was like?
Speaker 1 (14:39):
Yeah, so, and I have to be very honest, I don't want to call them out publicly, but we had an amazing onboarding team when we joined Klaus. I had two people, one based in Malaga, in Spain, and the other one in Amsterdam, and we had weekly calls. It was amazing. They really guided me through the platform, they helped me create all the different categories and scorecards that we
(15:01):
had, and gave me so many ideas on what we could review and how we could really use the tool and get the most out of it. So our onboarding with Klaus was just amazing, and we had, I would say, one really good year. And then, unfortunately, and maybe it's only my point of view, Klaus was acquired by Zendesk, and the follow-up and the way
(15:26):
that we were treated, if I can say, not that it was negative, but the way our company was handled from then on was not the same, and the experience was a bit different. The tool also started to lack a few things that we were very interested in; a few features started to go away, which was not optimal for us.
(15:49):
decided that we're going toshut down actually the QA
program with them, because we'realso internizing all of it and
because internally we're pushingfor a lot of AI, we're going to
see if there's a way that wecan use AI to automate our QA
process.
That's great.
Speaker 2 (16:03):
So how does your QA
process work?
Speaker 1 (16:07):
When we were using Klaus, we were reviewing between 100 and 120 chats per week, so on a monthly basis roughly 400 chats, more or less. Obviously that rate went down when we stopped using the platform, and what we prioritize mainly is the negative reviews that we're getting from customers.
(16:27):
So I'm very lucky, to be honest, in that whenever I get a negative rating and it's during the EMEA working time zone, I can just jump into the support chat and talk to the customer: hey, I'm part of the QA team, I reviewed the chat. And I just tell them, okay, I see my colleague helped you with this. I actually had a case like this this morning, so it's very
(16:48):
fresh. I just went over the chat and said, okay, I see that Michael actually offered you a solution and was very thorough in explaining it, so I'm trying to understand why it was a neutral rating, three out of five. I'm trying to understand what we could do better. And the customer just told me, it's not about the person who helped me in support, but more about the platform, because it's
(17:08):
Like good job for the supportagent, but it's definitely
something we need to improve onour end.
So that was also an opportunityto pass on feedback to our
product team.
But, going back to the originaltopic, to choose a chat so
negative ratings is the firstchat that we're going to review,
and then what I'm trying to dopersonally right now is to pick
(17:29):
some of our recent joiners in the team and just go over their chats. I just want to see if there's a way that we could standardize a few things. So reviewing, for example, the visual aids, the tone, the communication whenever they're escalating things to engineers, what they're doing and whether they're missing information, and things like that. So I'm picking those chats first.
(17:52):
So I'm not reviewing more than 10, maybe 20 chats per week. The drop is huge and we're aware of this, but instead of a quantity of chats, we're really focusing on quality. And whenever we review a chat, we try to be as fair as possible, and the feedback that we're going to pass is constructive and actionable. It's more like, I'm going to grab you by the hand,
(18:14):
you're going to sit with me, and we're going to go over the chat together and see if there's something that we can improve, so that next time we have a similar case, it doesn't happen again. But really coming from a constructive place and not blaming, like, hey, you did this badly. That's not the intention at all.
Speaker 2 (18:32):
So you mentioned your
product feedback loop and
actually that it's helping you to uncover product issues during QA, like that negative CSAT. Do you have tips for other teams on how to create that feedback loop, how support can work with product teams?
Speaker 1 (18:52):
Something that has been a bit difficult for us to do was actually to really create a good relationship with the product team, because the product managers are actually still doing support nowadays. They're still in support, so they can also see the platform and what customers are telling them about it whenever they're in support.
(19:13):
But because the product team has always worked very closely with engineers, which is expected, from the support side we've never reached out and said, hey, maybe we can also help. Whenever we needed to file feedback, we'd just forget about it afterwards: hey, yeah, we tell the customer we filed the feedback with our product team, they'll be in touch, and that was it. It was actually raised in a conversation a few months ago by
(19:37):
some of our customers: I gave feedback about this a few weeks ago. Do you know if it was taken into account? Do you know if the product manager read it? Customers want to know. Oh yeah, they want to know, and I completely understand. I would want to know as well. Like, this feature could be super important, I'm not the only one asking for it, so why isn't it available yet? Which is totally fair.
(19:59):
So we actually started reaching out to our product managers and building a strong relationship with them. Like, we know that you're in support, you're always seeing everything that support is doing and all the feedback that we're filing. And sometimes we're filing feedback out of a gap in knowledge on our end, because there's something that was done in the tool that actually answers that specific
(20:21):
feedback, but maybe we weren't informed about it or maybe the knowledge base doesn't have enough information about it. So we're always asking for visibility on roadmaps and new features, even if it's just one small button at the top right corner that is going to help clone something super easily.
Speaker 2 (20:40):
I think that sounds like a really great relationship between product and support. You mentioned revamping how knowledge is shared internally, and that's really related to QA, you know? I mean, that goes back to training and your knowledge base. Do you have a sense of what doesn't work as far as knowledge related to QA?
Speaker 1 (21:01):
Not sharing it. And it can sound funny, but it's actually something that we're going through, and we've been going through it for the past year. Because our platform is so wide and we have so many features, we're starting to get some support agents to be specialized in only specific features and
(21:23):
specific tools. I was in support last week and it was a very funny case: a customer came in asking about something, and reading it I was like, I've never heard about this before. So I actually wrote in the notes of the chat, which some of my colleagues were checking, I've never heard about this before, can we actually do that?
(21:43):
And then somebody actually jumped in and said, yeah, we can do this for this specific platform. There are so many places where knowledge has been shared that we don't even know about and we're not gathering it. So something that we did internally was create a specific channel for all the knowledge shares that we're gathering from support sessions, and every time we have a support session, we just share something new that we've
(22:06):
learned, because maybe it's not going to be something new for others, but it will be something new for a colleague that is in Guatemala and I'm not going to talk to in the next few weeks because of time zones or something.
Speaker 2 (22:17):
Yeah, so back to QA. Do you do peer-to-peer reviews?
Speaker 1 (22:24):
Oh, yeah, yeah.
Speaker 2 (22:25):
Do you find that absolutely essential?
Speaker 1 (22:27):
Yeah, that was actually one of the features that was removed by Zendesk, and we were kind of sad to see it go because it was something we were really using a lot. It really helped us improve, especially from the
(22:47):
support agent perspective.
We know that, and it's going to sound rough maybe from how I'm going to say it, but we know that when feedback comes from a support lead or a team lead, in general it can be perceived as really harsh, like, yeah, it's my manager telling me I'm doing a bad job. Then you feel bad about it, you really pay attention to it next time, and you just seek approval and make sure that your manager is
(23:08):
seeing that you're making an effort and you're taking into account what they're telling you, which obviously is completely understandable and completely logical. I abide by that. But when it comes from a peer, somebody that you've been working with either in the same office or in another office, but that you talk to on a weekly or monthly basis, it's more of
(23:29):
a friendly chat. You know, like, hey, I saw this chat that you had and I was reviewing it, and personally, maybe I would have done this a bit differently, because if you had actually sent a screenshot to the customer, you would have avoided like four or five messages and the answer would have been there in the first place. I would say in eight out of 10 cases, whenever we have
(23:51):
peer-to-peer review, the agents are always telling me, I work better when I see the review from my peers, because I know that they're in the same situation as me, and if the case were reversed, it would have happened the same way, I would have given exactly the same feedback. So we can really see that it has improved things on our end, and support agents are happier to just have a session and say,
(24:13):
okay, I'm actually going to do support with another person sitting next to me and we're going to review the support chats that we're working on together. So I'm just going to open up my laptop and start working on my chats, they're going to work on their chats, and then I have a question or a doubt at some point, not about the issue itself, but more about how would you say that to a customer. Maybe we have a difficult customer, or it's a difficult answer you
(24:34):
need to give them and you don't always know how to do it. So yeah, peer-to-peer for us has been huge, and it has also helped build closer relationships between our support agents, especially when we had new joiners in the team.
Speaker 2 (24:46):
It's kind of like
mentorship. Do all your specialists offer QA, or is that something that they train into?
Speaker 1 (24:56):
They all train into it, and it's not mandatory for them to actually do it. Some of them really want to give reviews to their peers, some of them just want to receive them, and some of them just don't want to do it; they want to receive the review directly from me or from one of their support leads. So I would say it's mostly on a case-by-case basis. Obviously, when we have a negative rating from our
(25:18):
customers, we're always going to give the review, and it's always going to come from the support lead level. But yeah, for the peer-to-peer it's more on a voluntary basis: if they're willing to get the review and to also give it, then let's just do it. So right now, even when we have a negative rating, we have people
(25:38):
just jumping in just to see how the conversation was handled and participating in the review, on top of what the support leads are saying.
Speaker 2 (25:46):
It sounds like you're really willing to expand or adapt the QA program to what specialists need: who wants to offer QA and what they need from it. So I have to wrap up. I have a number of kind of quick questions for you.
(26:08):
What's something you wish more executives knew about QA, but most of them just don't know or don't care about?
Speaker 1 (26:17):
So one thing that actually comes to mind is that one of our co-founders always, always, always told us it's better to ask for forgiveness than for permission, especially towards the customer, whenever we want to do the right thing by them. And I think we can apply that thinking to QA, which is that it's cheaper to prevent
(26:38):
issues than to apologize for them. So if we're able to actually review ahead of time everything that is currently wrong with our support process, we can work on that, we can improve it. Maybe it will cost some time and some money to actually do that, but it will save us a lot in the future.
Speaker 2 (26:58):
Good point, I like that. And you know, automation and AI are definitely growing. Do you have thoughts about how QA can make sure, as we automate more things, we keep the human touch?
Speaker 1 (27:10):
Yeah, we're using AI in our support process currently, and some of our customers are not super happy to be in touch with a bot and would rather be in touch with a human. So they're always asking, like, human please, human please, because they want to talk to somebody, which I completely understand. And it's true that the tone and the empathy that
(27:31):
a human will put in will not be matched by AI, and because it's a tech-heavy environment on our end and our platform is, well, very, very much tech-heavy, it's always good to have that human touch. So, yes, we can definitely use AI, and we're actually doing QA on all the chats that are being handled by AI to
(27:53):
make sure that the answers being provided are actually correct and accurate and that we're guiding the users towards the right resources. But, yeah, we always keep that human touch: even if it's AI handling it, there's a human behind it reviewing it and saying, yes, this is correct; no, this is incorrect, we need to improve this. So we shouldn't lose our human touch even if we have AI in the
(28:15):
support process.
Speaker 2 (28:17):
That's a really good
sentiment.
What's one thing you would never do again when it comes to
QA?
Speaker 1 (28:23):
Not having it, not having it. Which we did for a long time, yeah. And I felt really bad at first about why we didn't put the process in place before, because even if we've been a very fast-growing company and we've gone from a
(28:43):
small-ish number of customers to a really big number of customers, well, I don't even know how many customers we have right now, but even our team size scaled so much, and we could have made a lot of changes internally to our processes and to the way we handle support way before, even before COVID maybe, if we had actually had the right tools and were talking to
(29:06):
the right people about it. So, yeah, implement QA as soon as you can in your support process, even if it's just through spreadsheets; keep that process in mind and just implement it, because it's going to save a lot of time and a lot of money.
Speaker 2 (29:21):
Good point. So last question: if your QA process had a tagline or a motto, what would it be?
Speaker 1 (29:28):
For me it's QA to save the day.
Speaker 2 (29:37):
I love that. Thank you so much for being here and sharing your insights. I'm excited for other teams to get QA going the same way that you have. Yeah, and just thanks.
Speaker 1 (29:46):
Thank you for having me. That was amazing, it was really fun.
Speaker 2 (29:50):
Oh good, I'm glad. That's a wrap on our deep dive into QA with Chloe. If you're ready to level up your own quality program, here's our quick-start checklist from Chloe. First, lead with the tough stuff: do negative CSAT reviews first and get them out of the way. Two, trade quantity for quality, like our other guests have
(30:11):
mentioned: 10 to 20 thoughtful reviews beat 100 drive-bys. Three, make feedback peer-powered to soften the sting and speed adoption; peer-to-peer QA is a huge win. Four, add screenshots to every answer and watch your handle time drop. Five, escalate through a dedicated JIRA board so
(30:36):
engineers only see what matters to them. And finally, six, let Slack nudges keep tickets moving after day three. If a ticket gets stale, nudge your team in Slack, and definitely automate that; a rough sketch of that kind of automation follows below.
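Here is a minimal sketch of what that day-three nudge could look like, assuming the kind of Jira escalation board discussed earlier and a Slack incoming webhook; the project key, JQL, and environment variables are hypothetical, and you would run something like this from a daily cron job.

```python
# Illustrative sketch of a "day three" Slack nudge, not any team's real automation.
# The Jira site, project key "SUP", JQL, and webhook URL are hypothetical placeholders.
import os
import requests

JIRA_BASE_URL = "https://example.atlassian.net"
AUTH = (os.environ["JIRA_EMAIL"], os.environ["JIRA_API_TOKEN"])
SLACK_WEBHOOK_URL = os.environ["SLACK_WEBHOOK_URL"]

def nudge_stale_tickets() -> None:
    """Find open support tickets untouched for 3+ days and nudge the team in Slack."""
    jql = "project = SUP AND statusCategory != Done AND updated <= -3d"
    resp = requests.get(
        f"{JIRA_BASE_URL}/rest/api/2/search",
        params={"jql": jql, "fields": "summary", "maxResults": 50},
        auth=AUTH,
        timeout=15,
    )
    resp.raise_for_status()
    for issue in resp.json().get("issues", []):
        text = (f":hourglass: {issue['key']} has had no update for 3+ days: "
                f"{issue['fields']['summary']} - {JIRA_BASE_URL}/browse/{issue['key']}")
        requests.post(SLACK_WEBHOOK_URL, json={"text": text}, timeout=10).raise_for_status()

if __name__ == "__main__":
    nudge_stale_tickets()
```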
So I hope you try this.
If you do, even just for a week, let me know how it goes,
because I'd like to see if launching a program like this
(30:58):
has an effect on your team.
See you next time.