Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Some of the wins I've had... you know, looking at certain big companies' algorithms that they've spent millions of dollars on to develop, and the buyer at this big company is really relying on that algorithm to tell them what to order, when to order, how much.
(00:22):
forecasting one of my products, one of my coworkers and I kind of dug into it and we said, oh my gosh, this is incorrect because of an anomaly that happened last year. How do we get this message across? How do we present this and let them know that it's wrong? So we took that information and we went to the parties that needed to hear it and we presented it to them in a simple way: hey, inventory out of stock.
(00:44):
Last year your algorithm didn't read that because it was such an anomaly, it was an outlier, that it just cleaned that outlier from the data set. It ignored it altogether. Therefore, you guys are going to be massively understocked this year. We recommend putting these numbers into your own algorithm, because you're comfortable with that, and seeing what happens. And the result of that was a massive win for our company.
Speaker 2 (01:06):
From Alloy AI, this is Shelf Life.
Speaker 3 (01:24):
How can you be data-driven as a smaller company without massive resources?
Speaker 2 (01:29):
How do you plan in an
unpredictable environment?
Speaker 3 (01:31):
What pitfalls should buyers and suppliers avoid when trying to make data-driven decisions?
Speaker 2 (01:36):
How do you spot and
capitalize on spikes in consumer
demand?
On every episode of Shelf Life, we answer questions like these and more, with the help of leaders across the consumer goods industry. Today we welcome Eric Richardson, manager of forecasting and data analytics at USAopoly. Eric is a business operations leader with a passion for using data and analytics to drive growth.
(01:58):
At USAopoly, he focuses on forecasting consumer demand, sales analytics, demand planning and supply chain optimization. Eric has a background in finance and banking and, previous to joining USAopoly, was regional operations manager for the western region at Macy's, where his initiatives increased revenue by 10% year over year.
Speaker 3 (02:18):
I'm your co-host, Joel Beal, CEO of Alloy AI.
Speaker 2 (02:21):
And I'm your co-host, Logan Ensign, chief customer officer at Alloy AI. We'll be back with Eric right after this.
Speaker 4 (02:30):
Selling consumer
goods is a tough job.
Between rising costs, supply disruptions, price-sensitive consumers and retail partners who have their own set of priorities, it's becoming harder than ever to execute today, much less plan for tomorrow. There's plenty of data to help, but pulling and analyzing reports from retailer portals is so tedious and time-consuming that you can't respond to problems at the shelf until the
(02:51):
bullwhip hits you in the face.
That's cow, actually.
Now there's Alloy AI to help. Alloy automatically aggregates and harmonizes data from your retailers, supply chain partners and ERP. Then we make it easy to find insights using pre-built, consumer-goods-specific metrics and dashboards, so you can sense, predict and respond instantly.
(03:13):
Try it out today and get a demo at Alloy AI.
Speaker 2 (03:18):
Eric Richardson.
Welcome to Shelf Life.
To get us kicked off, we'd love to just learn a little bit more about you and your background, your career journey, what got you where you are today.
Speaker 1 (03:30):
Yeah, thank you. I have an interesting background that spans operations, consulting, banking and now games.
I did, as you mentioned before, operations for Macy's and optimized their supply chain, kind of cut my teeth on that. Then I went back to school, got an MBA from University of San Francisco and did a stint in banking and capital stress
(03:53):
testing, heavily using data to understand capital requirements. I helped out some startups in the Bay Area building algorithms and understanding how to forecast last-mile deliveries.
Then randomly I got a call from a recruiter and he said, would you like to join this gaming company? He said they had surfboards, bikes, and it was on the beach.
(04:17):
I said sign me up, I'll do it.
Now I do. I'm the manager of forecasting and data analytics for USAopoly, based in Southern California. I work with data every day, from customer data to supply chain data to financial data.
(04:38):
In a weird, full-circle way, it really encompasses all of what I did before into one job.
I really enjoy doing it.
Speaker 3 (04:49):
Are there surfboards
and bikes, Eric?
Speaker 1 (04:52):
Surprisingly, there were. They were in the warehouse area downstairs. People didn't really ever use them. Some of the bike chains got rusty, and the surfboards, I didn't really trust them. There's also sharks right off the coast there and I'm kind of scared of sharks.
I watch people surf a lot there.
Speaker 2 (05:11):
Well, I think in that description you used the word data, by my count, six or seven times. Eric, I know you personally as well. We know that you very much know your way around data. I think one topic we'd love to dive into is what happens when your buyer doesn't. As you kind of think about that dynamic, what pitfalls do you
(05:32):
see people falling into, whether it's buyers or suppliers, when it comes to making data-driven decisions?
Speaker 1 (05:39):
Yeah, that's a great question. Thank you for having an accurate count on my use of the word data. The way that I approach this, and the way that I look at it, is that there's a human element to all purchasing decisions. When you are recommending things to a company that you're
(06:01):
trying to sell something to, you're trying to convince someone else that it's a good idea to spend their money on your product. What I've found is, sometimes, no matter how much good data you can show someone on how they can become more efficient, for reasons unknown, they just don't want to purchase.
(06:22):
What I focus on is the idea that it's kind of like pushing on a string. You can't really push on a string, it kind of bends, but you can pull on a string. So what you have to do is put all the data in front of someone in a really easy-to-understand, digestible way, so that anyone that's looking at this, from the top to the bottom, can see very clearly:
(06:43):
I will make money from this. I won't make money from this. How quickly will I make my money back? And that's really what I focus on. I pull tons of information. It's just so much it would make your head spin, but a lot of what I do is making it really simple. Maybe sometimes it's a million-row spreadsheet and I'll bubble it down to like four cells just for someone to look at.
(07:06):
It just really encompasses, high level: here's when you're going to make the money off of this, here's how long it's going to be tied up for, here's the benefit to you, your return.
Speaker 3 (07:15):
Eric, do you find that, you know, you say, hey, I'm going to have millions of rows of data, right? There's a lot of data out there in the world now, today. You want to bubble that up to a couple key things, right? Simplify it. Do you find that the things that you're focusing on depend a lot on who you're talking to, or is it the same general points that are
(07:38):
kind of being made over and over again?
Speaker 1 (07:39):
There are commonalities, like things that are always the same, that people want to see, but a lot of it is knowing your audience. One thing I learned in school was sell to your audience. You know you have to understand your product, but you also have to understand the person you're trying to sell it to even better. So when I'm presenting
(08:01):
information to someone, I'm essentially selling something to them, but I'm just giving them data, trying to make them believe in what I see.
Speaker 3 (08:08):
So, yes, it has to be tailored and simplified. Data tells stories and, as you said, you know, the data is the data, assuming it's correct, it's factual. But the data that's presented, the way you present it, can tell quite different stories, right? You see this across academia, science, etc. How often do you find you're able to change someone's opinion? Because often data confirms things that we
(08:30):
already think we know, hunches we have. It could be a little harder when you're like, wow, the data is very different than what I expected. I'm curious if you have examples of that, or if you've found it happens frequently or infrequently. Does that depend on whether you're talking to somebody internally or you're trying to, you know, sell to a buyer at a retailer, etc.?
Speaker 1 (08:49):
You know, I wish I could tell you people change their mind more than they do, but that is one of the tough things. You can throw out the most accurate information to someone; they may not receive it as accurate, or they might have other sources that are telling them other things. Some of the wins I've had, you know, looking at certain big
(09:10):
companies' algorithms that they've spent millions of dollars on to develop, and the buyer at this big company is really relying on that algorithm to tell them what to order, when to order, how much, and, you know, down to the time that they need to pick it up and ship it to get it into their warehouses. And there's an example of when I noticed, by graphing out and
(09:31):
visualizing how their forecasting algorithm was forecasting one of my products, I noticed something funny, and one of my coworkers and I kind of dug into it and we said, oh my gosh, this is incorrect because of an anomaly that happened last year. How do we get this message across? How do we present this and let them know that it's wrong?
(09:51):
It came down to simplifying again, and then not talking about why the data was incorrect, but talking about the main causes as to why the algorithm was misreading the historical data. So we took that information and we went to the parties that needed to hear it and we presented it to them in a simple
(10:13):
way: hey, inventory out of stock. Last year your algorithm didn't read that because it was such an anomaly, an outlier, that it just cleaned that outlier from the data set. It ignored it altogether. Therefore, you guys are going to be massively understocked this year. We recommend putting these numbers into your own algorithm, because you're comfortable with that, and seeing what happens.
(10:36):
And the result of that was a massive win for our company, because the data was corrected and they accepted the change that we recommended. But it doesn't happen like that a lot with some big retailers. That's kind of the exception to the rule there.
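To make the failure mode concrete, here is a minimal sketch of how an automated outlier filter can silently drop a genuine demand spike, so the forecast understates a repeat of it. The IQR rule, the numbers and the function name are illustrative assumptions, not the retailer's actual pipeline.

```python
import numpy as np

def clean_outliers_iqr(history, k=1.5):
    """Drop points outside k * IQR, a common automated cleaning step."""
    q1, q3 = np.percentile(history, [25, 75])
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return history[(history >= lo) & (history <= hi)]

# Weekly unit sales; week 9 is a genuine anomaly (say, last year's
# stockout-driven surge), not bad data.
history = np.array([100, 95, 110, 105, 98, 102, 97, 104, 480, 101])
cleaned = clean_outliers_iqr(history)

print(f"naive mean forecast, spike kept:    {history.mean():.0f} units/week")
print(f"naive mean forecast, spike cleaned: {cleaned.mean():.0f} units/week")
# The cleaned series settles near 101 units/week and would leave the
# buyer massively understocked if the spike repeats.
```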
Speaker 2 (10:52):
But what I'm hearing is don't let that discourage you; it's still worth it.
Speaker 1 (10:55):
Yeah, the fun for any forecaster or quant is digging in and looking at the data to find the root causes of what is making these numbers appear in front of me, what's causing those numbers to change. So a lot of the fun is learning through digging
(11:15):
in and exploring the data and joining it with other sets. And if the end result is a good one, like someone accepts and takes your recommendations, that's awesome. If someone doesn't, you simply look at that and say, look, I learned, my team learned, we're all smarter and better because of this. And now we know, next time this comes up, we're going to put it
(11:36):
in front of the parties that need to see this, maybe in a different way, and maybe we'll get a different result. But it doesn't stop us from doing it.
Speaker 3 (11:44):
Eric, you were mentioning earlier the counterparty in this being a large company, a large retailer. It sounded like lots of resources, lots of, probably, data scientists and a large engineering team to build all sorts of algorithms. Tell us about USAopoly, a smaller company, although your products are all over the place. Logan and I were just chatting about how we both interacted
(12:06):
with them just over the weekend in unexpected places, you know, in a national park, in my case, with your National Parks Monopoly, which was cool to see. But how do you, as a smaller company, create this data-driven culture? How do you invest, you know, in a way that's appropriate for kind of your size and your resources, and kind of create
(12:29):
that culture? We have a lot of smaller companies that I think are trying to figure that out.
Speaker 1 (12:33):
Joel, you hit the nail on the head when you're saying culture. It is about creating a culture, not creating a number system or creating a data practice. It all comes down to the culture. So when I came into my company, there was a forecasting function that had been running for 10 years.
(12:55):
They largely already knew what they were doing. They could forecast really, really well, but there was a lot of hesitancy in upper management as far as accepting and trusting the information that was coming in front of them, and then using that great information that was being generated to impact and drive business
(13:16):
decisions that would drive revenue and drive efficiencies and cost savings. I made the mistake of coming in at first like a bull in a china shop and saying, oh, look at all this stuff, let's do this, let's do this, let's do this. And it wasn't accepted too well. But what I learned from that is the importance of starting small. So in any company that's trying to adopt a data-driven culture, I would encourage the folks that are trying to implement
(13:39):
that to start small, start with little wins. So start by validating and verifying: hey, my company is already doing this. Why don't I show them the data that proves, hey, you made a great decision by making this XYZ business decision? I'm going to show you how that impacted your revenue.
(13:59):
I'm going to show you how that impacted your inventory on hand. So you start small and get little wins and you kind of build from there. And what I found at my company is that the more little wins you can get, and the more you can show them that, hey, you already know this innately because you made a great business decision, the more you can show them that the data supports that, the more people start to trust and say, hey,
(14:25):
what does the data say? One of the cool things for me was the first time someone in the boardroom said, what's the data saying about this? Let's ask Eric to model this before we make this decision. And that was a cool moment for me, because it had taken maybe three and a half years of proving little wins and putting
(14:45):
data and models in front of the bosses before they were comfortable with getting ahead of it. So start small, prove out what's already doing well. And then I would also encourage people: make sure that when you are trying to develop a data-driven culture, make sure that
(15:07):
you don't throw data out there that's just data. Whenever you're presenting data or analytics to someone, you should always give the business case as to why the data says this. Data says buy here, sell here, but then you have to translate that into the business case around it: buy this product, hold it here, move
(15:31):
it from here. So you have to quantify and put context around the data rather than just throwing numbers out at people. I would also say really important to driving a data-driven culture is hiring really smart people that are smarter than you are, and not being afraid to let them run
(15:51):
free, not being afraid to give them a long leash, per se, to have fun and explore and experiment with the data, because you never know what you'll find. We have a really, really smart young programmer, a database analyst, that works with me, and I always tell people, without him I wouldn't be able to do my job. The things that he thinks of, and the way that he writes code
(16:15):
and the way that he kind of serves up data for me to analyze and make great business decisions with, is really awesome. But he's super, super smart and I want him to take my job someday. So you have to not be scared to hire talent that will surpass you.
Speaker 2 (16:31):
Man, I love all those points. That first point, Eric, is fascinating to me on data culture: that often data can reinforce things we already know, and that's an important part of the change management journey. That's not a waste of time. And then what I also heard was make sure that, as you present data, you're presenting context and insights.
(16:52):
And then, lastly, don't be afraid to hire great people. Yeah, I think that's a fantastic sort of blueprint to help organizations go on that journey.
Speaker 3 (17:01):
So, as we look into the future of analytics and data, obviously everyone right now is talking about AI. I think the focus has probably shifted a little bit, from machine learning a couple years ago and what was going to happen there, to generative AI more recently. I'm curious if you've played with either of those and
(17:22):
what you think the impact will be on analytics going forward.
Speaker 1 (17:28):
Yeah, I mean, in grad school in Silicon Valley, pretty much in San Francisco, we would build models like random forest models and large language models. We would build those out and train them and then have them give us results and predictions. This was, what, 10
(17:48):
years ago, and we were thinking, man, this is so cool, but I don't know if we're ever going to have the computing power or if anyone will ever massively adopt this. Now, fast-forward 10 years, and it is everywhere, mainstream. I think it's largely a good thing. I think it will help simplify smaller tasks, which it's already doing: helping you manage your calendar at work, helping
(18:15):
you write better code, helping to check that you might not have made simple input errors when you're inputting things into a spreadsheet. On the whole, there's a theory out there that AI is going to take people's jobs. I think that AI is going to really amplify people's jobs and
(18:35):
give people more time to hang out with their family and do things that they like doing, because it will make them more efficient. As far as forecasting and supply chain optimization, AI has been around for a long time, in the simple fact that when people say artificial intelligence, it's these large, large models. But when you boil those models down, it's
(18:59):
just a simple equation: hey, I have a Y variable, a Y target, that I'm trying to predict, and then here are my X variables, the things that will make that thing happen. So those simple equations have been used for years and years in forecasting, and I think the computing power that comes with
(19:19):
AI, and the power of AI to just kind of really comb through what everyone else has done in the past, is going to make my job a heck of a lot easier, to the point where I don't have to go through every SKU that I'm trying to forecast and figure out what's my seasonal demand pattern or what's my
(19:41):
profile for when the demand will hit. I see AI being able to just really, really quickly snap the fingers, serve that up to me, and then I can really act on it.
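As a rough illustration of that Y-from-X framing, here is a minimal regression sketch; the features, the data and the use of scikit-learn are all invented for this example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X: explanatory variables (week of year, promo flag, price); y: units sold.
X = np.array([
    [10, 0, 29.99],
    [11, 1, 24.99],
    [12, 0, 29.99],
    [13, 1, 24.99],
    [14, 0, 29.99],
])
y = np.array([120, 210, 125, 220, 130])

model = LinearRegression().fit(X, y)                  # fit y = f(X)
forecast = model.predict(np.array([[15, 1, 24.99]]))  # predict the Y target
print(f"forecast for week 15 on promo: {forecast[0]:.0f} units")
```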
Speaker 3 (19:52):
Well, I think that's a nice segue. We're in a world, particularly the last couple of years, where I think there's more unpredictability than ever before. I mean, there have been huge swings in demand: when COVID hit, during COVID, now post-COVID, if we can call it that. I know that's really impacted the toy industry.
(20:14):
I'm curious to hear how you dealt with that tricky couple of years, which probably continues even now. Just how you responded then, and how maybe with things like AI and other tools, you get better at being able to respond quickly in the future to changes that we can't predict.
Speaker 1 (20:31):
Yeah, COVID threw everyone an unpredictability wrench. It flipped a lot of things on their side, and all the norms that we were used to changed. For the toy industry, it made a lot of changes for the better. Everyone couldn't go out. So what are they going to do?
(20:52):
They're going to buy toys and games, and they're going to enjoy time with the only people they can interact with: their family. Across the whole toy industry we really saw a massive demand spike. To be honest, none of us predicted that, because none of us could have predicted COVID happening. And then, following that massive demand spike, was a
(21:14):
supply spike, which just so happened to get held up off the Port of Los Angeles and created a lot of empty shelves and tied up a lot of working capital in containers for a long time. But one of the cool things that my team learned from that big unpredictable demand spike that we couldn't have known about until it actually happened is we were pretty nimble.
(21:37):
We would watch our data very closely, on the daily level, sometimes on the hourly level, minute level, but we watched one retailer very closely, and their data in some ways represents demand overall. It can be used as a proxy for what the market is thinking, and we saw one interesting metric start to spike.
(21:58):
My analysts and I were looking at it and we're like, that doesn't really make too much sense. Why are people looking at these products so much? During Christmas it makes sense, but not in February or March or April. This doesn't make sense. So we looked at that. We said, all right, let's come back to this in a week and see
(22:20):
if this is a trend or if this is just an anomaly. Sure enough, we went back in a week, and the interest in these products, which we were measuring via people looking at them and interacting with them, had really gone through the roof. So we made a very simplified set of data and took it to our
(22:42):
production managers and said, hey, I think that people are going to be buying games a whole lot because of what I mentioned. They're only with their families. It's a really fun way to interact and it's relatively inexpensive. Here are five products that we need to really invest in now and get ahead of and bring supply in to meet this demand.
(23:03):
I think it's gonna keep happening. So we did that with a couple of products and got ahead of it, and we had inventory in ahead of time. The problem was that we were a little too conservative; we couldn't have predicted that it was even more so. We sold through everything we had. We saw that spike and we brought the inventory in on time
(23:24):
but it still wasn't enough. We had a lot of really good learnings from that as far as understanding tolerances in the models that we build, understanding how consumers think, and also having a great outlier case that we could throw into our models when we wanna throw some unpredictability into them in the future. I would encourage the data folks that are helping to give
(23:47):
information to people making decisions: trust, but verify. So if you see things that are anomalies in your data, make a note, flag it and see if it creates a pattern. Once it creates a pattern, don't be afraid to build the business case that supports what you should do with this pattern that you're seeing, and then act on it. But always put good
(24:08):
numbers behind it as to the probability of it happening, how much incremental revenue it's gonna get you and what the cost is.
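That note-flag-and-wait habit can be sketched as a simple rolling-baseline check; the window, the threshold and the data here are illustrative assumptions, not the team's actual monitoring logic.

```python
from statistics import mean, stdev

def is_spike(series, window=8, z=3.0):
    """Flag the latest point if it sits z standard deviations above the trailing window."""
    baseline = series[-(window + 1):-1]
    return series[-1] > mean(baseline) + z * stdev(baseline)

weekly_interest = [210, 198, 225, 205, 215, 220, 208, 212, 640]
print(f"spike flagged: {is_spike(weekly_interest)}")
# Per the advice above: flag it, come back in a week, and only build
# the business case once the spike repeats rather than acting on a
# single anomaly.
```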
Speaker 3 (24:17):
If we go back to earlier in our conversation, around how you bring data and it's a question of how often you can convince someone, change someone's mind, based on that. You used this example of, okay, you're seeing interest, people are looking at your products online. Maybe it hasn't quite converted into higher sales yet. I mean, it's a crazy time, right?
(24:37):
Everyone's, as you said, holed up in their houses, and you flag it. It sounds like you come back later. You sort of say, okay, we really think there's something building here. You go to your production manager and say, hey, I want more. Was that a conversation? Did you find that people said, yeah, that makes a lot of sense, I can tell the story to myself around why there's this
(24:58):
burgeoning demand for our product? Or was it a debate internally? How did that play out?
Speaker 1 (25:03):
Yeah. So at first, when I showed it to my analysts and the other team members, they were kind of like, you're crazy. And I was like, well, let's just all watch it, so we can all be in this together with this crazy decision. And then we went to the production folks, and we went to the finance folks that would be paying for the production, and we presented it to them, and they said, we think
(25:25):
you're probably onto something here, but how can we know? So what we did is we trusted but verified. We modeled backwards. So we essentially back-tested a model and said, this metric that we were using, like the interest in the product without buying it, in the past, how often has that interest correlated to a purchase? Meaning, for every person that looks at this, how often
(25:48):
did they buy it? So we ran that model and we back-tested it. And then we ran it with the numbers that we were seeing and we said, okay, this is a little crazy and it's out of the ordinary for anything that we've ever seen. But, based on how history has played out previously, this interest should translate into this many purchases.
(26:10):
And when we put that information in front of them, the green light was given to test it out, saying, all right, cool, let's invest some money in this inventory that we need anyway. If it doesn't sell now and your prediction is wrong, it'll sell by Christmas anyway. So it really helped them. It helped change their opinion by back-testing the model and
(26:32):
showing them the proof that, hey, this metric I was looking at, that I think is gonna impact purchases, has actually predicted purchases in the past, so we believe it will now.
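A toy sketch of that back-test, assuming the "interest" metric is something like weekly product views; all figures are made up to show the shape of the calculation.

```python
# Historical weekly interest (e.g., product views) and matching purchases.
views_hist = [1000, 1200, 900, 1100, 1050]
buys_hist = [50, 66, 43, 57, 52]

# Back-test: how often did interest convert to a purchase in the past?
conversion = sum(buys_hist) / sum(views_hist)

# Apply the historical rate to the anomalous new interest level.
views_now = 5200
implied_buys = conversion * views_now

print(f"historical conversion rate: {conversion:.1%}")
print(f"implied purchases at current interest: {implied_buys:.0f}")
```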
Speaker 3 (26:42):
I love the trust-but-verify. Back-testing is obviously always a great way of saying, okay, if we're seeing this, let's see how this model would have worked in the past. A good way, in my experience, of helping build that trust. But even your comment at the end: you did that, and it was probably, okay, let's try this out. And then in retrospect you're like, man, I wish I would have doubled that order.
Speaker 1 (27:04):
I mean, in retrospect, we all wish we would have doubled and tripled our orders, but it always comes down to risk tolerance and understanding where your organization is and where your capital requirements are. And at that same time, we didn't know if we were gonna get paid by vendors and we didn't know what everything would be as far as cash flow.
(27:25):
So it was a calculated risk that paid off. But yeah, looking back, I wish we would have purchased five times more.
Speaker 3 (27:31):
Yeah, I think there's a lot of companies in that boat, but they didn't even get that first one, so that already got you ahead.
Speaker 2 (27:37):
Well, Eric, I'll say one of the things I just love about these conversations is getting to learn more about folks' businesses and their products. And I think you all have a pretty unique model, particularly around licensing. We know that's really big in toys and games, but we also know it's maybe especially critical
(27:57):
for you all in your business model. So I'd love to hear more about that business model. How do you make decisions about what IP to license? What games have that longevity and popularity to fit your model? We'd just love to hear more about that.
Speaker 1 (28:13):
Yeah, that's a great question. I love talking about this. So at the core of our business we produce board games. We have our own IP: Blank Slate, Telestrations, Hues and Cues, Tapple. Many people may have played those games, but we have another segment of our business that makes a lot of money, where we
(28:34):
work with large board game manufacturers like Hasbro, and we tie up their game mechanics with really cool IP that people are interested in. So if you're a big Star Wars fan or if you're a big Frozen fan, we're gonna give you a game you're familiar with, like Monopoly, and we're gonna let you play it as Elsa or as Darth
(28:56):
Vader. The interesting thing about licenses is a lot of it comes down to measuring consumer interest. So when we are evaluating licenses, we wanna look at, what's the staying power of this? Is it just a flash in the pan, like a TV show people are really interested in, and then, once the season ends, no one's gonna be interested, and
(29:19):
our games are gonna sit on the shelves? A big way that my team and I have helped our organization make data-backed decisions on choosing licenses is we've built models that take into account customers' propensity to purchase, meaning looking at other games and other products that are similarly
(29:41):
licensed. So is this a very fanatic audience that searches on Google a lot for it and then doesn't purchase, or are they an audience that tends to purchase a lot of the products, or they're committed to it? So we found Disney, Star Wars, Super Mario, a lot of these licenses really have staying power, and I think
(30:04):
we've just scratched the surface as far as tapping the consumer base and the consumer that wants to buy and really interact with their favorite licenses. We use a lot of Google Trends. We use a lot of market research information about who else is offering this type of license, what price point it's at and how long it has lasted.
(30:27):
It's really been part of creating a culture that makes data-backed decisions, because the license acquisition process used to just happen. We had experts that knew the licenses and they had experience with the people that were licensing them out. So we got them and we tried them. But what we found is there
(30:48):
would always be 30% that didn't hit, and my CFO kind of challenged me and said, how can we figure out how to make only 10% miss? So we developed this methodology around understanding propensity to purchase and understanding how deep someone is willing to go. If someone's willing to commit their hard-earned dollars to your
(31:12):
product and that license over and over, it's worth committing to and saying, we're going to invest three years of development into making a game and printing a whole bunch of them.
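A hypothetical sketch of that propensity-to-purchase screen: a loud fan base that searches but doesn't buy scores lower than a smaller one that commits its dollars. The license names and numbers are invented for illustration.

```python
licenses = {
    # name: (monthly search interest, units purchased)
    "License A": (500_000, 4_000),  # fanatic audience, low conversion
    "License B": (120_000, 6_000),  # smaller audience, buys deep
}

for name, (searches, purchases) in licenses.items():
    propensity = purchases / searches  # purchases per unit of interest
    print(f"{name}: propensity to purchase {propensity:.2%}")
# License B converts roughly 6x better per searcher, making it the
# stronger licensing bet despite the smaller audience.
```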
Speaker 3 (31:24):
Do you generally know when you get a license, again getting back to this kind of gut idea, will you be like, I know this one's going to be a winner, and there might be others where you're like, it may work, it may not? I'm curious how maybe the gut reaction to brands, we all have things that we
(31:44):
gravitate toward, aligns with how things actually perform.
Speaker 1 (31:46):
Yeah, it's funny you asked that, because people at my work make fun of me sometimes in a nice way. They're like, oh, I bet Eric likes this one; it's not going to sell. Because the one game that I really stuck my neck out on, I loved it, I played it and I thought it was so fun, it was so great. So then I forecasted a whole bunch, and we bought not that
(32:06):
much, thank God, and it was just such a dog, it bombed. So I clearly wasn't the target audience for that. But we do have kind of a gut reaction to, this could be good, this could be bad. But I like to think in ones and zeros, or black and white. So when I'm making a decision or I'm recommending a decision
(32:30):
for a license, I need to strip that decision of all of my biases and I need to strip all of my emotion from that decision and go, all right, I really like this, so I'm going to really dig in and try to prove with the data that this is going to work. Or if I don't like it, I still have to do that same process. So it really has to be process-driven.
(32:51):
But I would say my track record for my gut is really bad. So I'm really happy that I have a great team and we understand how to analyze it from a data perspective, because if we trusted my gut, I don't know if our company would be around still.
Speaker 2 (33:05):
Well, in the journey to get there, I mean, was there a lot of iteration and feedback loops based on, okay, these were our winners and this is what we could have seen beforehand? Or was it pretty clear to you, oh, let's look at these different data elements, and that's going to be really helpful in informing what to make bets on?
Speaker 1 (33:25):
Yeah, that's interesting, because I mean, we have had so many meetings about this and so many strategy sessions about this, because the quantitative side of it is really easy. All right, you get together, you know, you get some coffee, you get a whiteboard and some markers, and you write down all of the variables, and they're all things that you can measure
(33:45):
quantitatively that you think can predict the success of something. But what we always get hung up on, and it's so hard for us data people to measure, is how do you measure the qualitative things? Like, hey, Logan really likes this, or Joel has a Squishmallows keychain, so he's going to buy more Squishmallows, and so on.
(34:09):
For the qualitative factors, we found it helps to involve teams that are not familiar or may not even be comfortable with data, but have really good depth of insight as to how those licenses work, what the fan base is, how they react to conventions. We've even tapped into, you know,
(34:30):
tell me how many people come dressed up as these characters from this license at Comic-Con, and we're like, that's a predictor that people are fanatical and they may buy it. So then we take that and we go, does this scale? So we look even bigger.
Speaker 2 (34:46):
Well, one more question here for you, Eric. We have heard from other folks, other customers of ours, about a rethinking of how people approach manufacturing, particularly manufacturing in China. I'm curious if you've got a perspective there, how you all are thinking about that. Are you evaluating different ways to source product?
(35:08):
We'd just love to understand your thinking there.
Speaker 1 (35:10):
Yeah, that's a great question. Putting all the political answers aside and looking at this from the business perspective, the role of any forecaster is to measure demand, try to predict demand and then create enough supply to
(35:33):
then meet that demand. A key element to that is time of transit. So if we need stuff on shelves in a month but our production lead time is three months, we're kind of in a bad situation. So we have started to find domestic options to have dual
(35:58):
production. So maybe we have capacity and components for 20% of our full production plan in a warehouse in the United States, where we can surge and create that and get it to market in three weeks to a month in order to meet the demand. With that, it's a little bit more expensive, but it pays
(36:21):
off, because you get to meet the demand and you get all the incremental revenue. We've just started doing that, probably since COVID, I think, for the industry as a whole. COVID was a real driver for that, because we all kind of realized, as all of our products were sitting off the coast not being sold, that we kind of need a different solution to this. So I think industry-wide we've kind of adopted the dual
(36:43):
procurement strategy, where you get enough components and enough raw materials to make a certain amount to surge for demand. One of the things that people don't think about a lot with overseas production is counterfeiting of products. So I'm a member of the IACC, the International
(37:06):
AntiCounterfeiting Coalition. We meet in Washington with Homeland Security and reps from Amazon, Alibaba, and we talk about strategies of how to clean the markets up. From a board game perspective, they're made out of paper and plastic. They're not a ton of money to produce, so you get a lot of value out of it for the consumer.
(37:29):
That same factor, you know, of not being very complicated or a lot of money to produce, has led to lots of counterfeits. We find that when we produce in the United States, zero counterfeits. When we produce overseas, the longer we produce the game, the more counterfeits introduce themselves into the market. And the more counterfeits are in the market, the more it
(37:51):
dilutes your brand, the more it steals sales away from your partners, who you work really hard with to get a lot of sales and to save that shelf space. So to me, looking at it, it's not only from a just-in-time perspective and meeting demand, it's also, how can we protect our brand the best, and what are the best decisions as far as
(38:12):
production location that will balance those two things?
Speaker 3 (38:17):
Very interesting on the counterfeiting point. So, to summarize, it sounds as though the mindset is really that kind of flex production: that last 20% you have more localized, so you can flex it up and down based on maybe shorter-term fluctuations in demand. But it's still cheaper to produce overseas for the bulk of production, at least for now.
Speaker 1 (38:36):
Yeah, it's all about the mix, or the weighted average, of your production cost. If we don't have to use the surge capacity, it's great, and our cost of goods is 10% lower or so. But when you do have that built in, where you can flex if needed, we've seen a lot of success with that. The goal is just to forecast it
(39:01):
well enough to get it all overseas.
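The weighted-average math is simple enough to sketch; the unit costs below are illustrative assumptions, with only the 20% domestic-surge share taken from the conversation.

```python
OVERSEAS_COST = 4.00  # $/unit, long lead time (assumed)
DOMESTIC_COST = 5.50  # $/unit, surge-capacity premium (assumed)

def blended_cogs(domestic_share: float) -> float:
    """Weighted-average cost of goods for a given domestic production share."""
    return domestic_share * DOMESTIC_COST + (1 - domestic_share) * OVERSEAS_COST

print(f"all overseas:       ${blended_cogs(0.0):.2f}/unit")
print(f"20% domestic surge: ${blended_cogs(0.2):.2f}/unit")
# Skipping the surge keeps cost of goods at the overseas baseline, which
# lines up with the roughly-10%-lower figure Eric cites for an
# all-overseas mix.
```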
Speaker 3 (39:02):
Right, if you just nail that forecast perfectly, problem solved. That three-month lead time doesn't matter if you could forecast out six with perfection.
Speaker 1 (39:15):
Yeah, I wish it was that easy. Bringing it back to AI, though, I really think with AI-driven methods that's going to happen more and more frequently, where we're going to have fewer misses on our forecast, a lot better lead times and better visibility into where demand is going.
(39:35):
So bringing it back to that, that's something I'm really excited about, for AI to make my record a little cleaner.
Speaker 3 (39:42):
There you go. Well, I think it'll be both. I think the forecasts are going to get better, and you certainly see this a lot in fashion, with fast fashion. But there's also that ability to capture those smaller, shorter-term trends. Your supply chain has to be able to handle it, and I think that's, to me, a very interesting approach of saying, look, is low-cost
(40:03):
manufacturing going to come back to the United States? Probably not fully. But that ability to say, well, I'm seeing short-term trends that I can capture, I can get those incremental sales, I'm willing to pay a bit more for that production. And I think with AI, and the more data that we're collecting, you kind of see those shorter-term patterns that maybe fast,
(40:25):
agile brands can adapt to pretty quickly. So I think you're going to see both. That's my take.
Speaker 1 (40:29):
I think you're correct on that, 100%. I think it will start to happen sooner than later. It's all about what companies adopt it, and then it gets cheaper, and then everyone adopts it.
Speaker 3 (40:42):
Well, Eric, this has been fascinating.
Speaker 1 (40:46):
Yeah, thank you for having me. I just love talking data, I love talking supply chain, and I love talking with people who understand what I'm talking about when I'm talking about it, and they don't just go, huh, or have like a blank look on their faces. So I really appreciate you all having me on the program. I'd love to come back any time to talk about what I'm passionate about: data and supply chain.
Speaker 2 (41:06):
Amazing. Well, I don't know if I have the final count on the word data, but maybe we cleared 100 in the word count there. We'll look back at the transcripts.
Speaker 3 (41:13):
We'll have to add a counter, you know.
Speaker 1 (41:16):
Well, I was actually using AI to count. We're at 258 right now.
Speaker 2 (41:19):
All right, well, fantastic, Eric, and thanks for joining us here on Shelf Life. Thank you. You've been listening to Eric Richardson, manager of forecasting and data analytics at USAopoly. That's all for this week. See you next time on Shelf Life.