Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Honestly, I'm just struggling right now because I am in my head now picturing Bowser with a French accent, and I'm going to have to go find this after the show.
I just made it up, but that would be hilarious.
Really?
Welcome back to Chain of
(00:20):
Thought, everyone. I am your host Connor Bronson, and today I'm joined by Carly Taylor.
Carly is the Field CTO for gaming at Databricks and founder of GG AI. Carly, welcome to the show. It's great to have you here.
Thank you, it's so good to be here.
It's really, honestly, exciting for me to have this conversation because, similar to you, I have always been fascinated by the field of gaming, both tabletop
(00:44):
and digital. And as you've noted to me before, gaming has driven the advancement of GPUs, graphical processing units, and it's really the tech that has built the current AI boom as we understand it today.
I know most people understand at least a little bit of this story, and I'll recommend the incredible Acquired episodes on
(01:06):
NVIDIA if anyone wants to go much deeper in a podcast format. But can you start by connecting the dots a bit for our audience? How did the video game industry enable today's AI boom?
Yeah, absolutely. I mean, I could talk on this for an hour, and I actually have before at conferences, to the nerds who are interested in things like this. But you know, it really goes
(01:29):
back to even before the first GPU hit the market, which was invented by NVIDIA about 25 years ago. Gamers have been pushing the limits of what their computers can do since we started making video games for computers, right?
I think back to RollerCoaster Tycoon, which was written by
(01:49):
Chris Sawyer, I think his name is, for the record.
One of the best games ever.
One of the best games of all time, still, and it has staying power, right? And what's crazy?
Oh, sorry, go ahead.
No, I was just going to say I picked it up on Steam recently and played it for a day, and it's just like, yes, it's still nostalgic, it's still fun. Yeah.
What's crazy about that game is that he wrote that entirely by himself in assembly. And the reason that he did that,
(02:13):
and anyone who programs will know, like, assembly is the nerdiest of nerd languages. You know, you're basically talking straight to the machine at that point.
And the reason he did that was because the CPU power at the time couldn't handle the types of things he wanted to do with, like, guest interactions, in the way that he wanted this game to work. So in order to hyper-optimize
(02:35):
this game, he just wrote pure machine-level code so that he didn't have to deal with any abstractions or anything that would slow down what he was trying to build.
And I think that kind of encapsulates the gamer mindset, which is, like, your only limit is your imagination of what you can build. And they won't let things like compute power slow them down. And so I think that, you know,
(02:58):
that mindset set up a trajectory for gamers to continuously push the limits of what compute could do.
When the first GPU hit the market, like I said, about 25 years ago, I think it was called the GeForce 256. It had the processing power of, like, a potato nowadays, right? Like, it was very rudimentary compared to what we consider today.
But over time, as GPUs started to become more widely used in gaming, you
(03:24):
know, they really accelerated graphics processing and the ability to make worlds richer and look more vibrant and more realistic. You want more polygons, you want things to look more real, you need a better GPU.
And you know, that consumer drive forward not only, you know, made NVIDIA the company that it is today and allowed them to innovate and be the best in the world at doing what they do, but it drove the price at
(03:48):
every point of those GPUs down. And you can actually see this trend over time when you look at basically the cost to do a calculation on a GPU. That's the best way to normalize it: what does it cost to do some sort of matrix calculation? And the cost for each calculation over time has just gone down continuously.
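To make that normalization concrete, here's a minimal sketch of comparing GPUs by cost per unit of compute rather than sticker price. The prices and throughput figures are made-up placeholders, not real benchmarks:

```python
# Compare GPUs by cost per TFLOP/s instead of sticker price.
# All numbers below are illustrative placeholders, not real benchmarks.
gpus = {
    "early consumer GPU": {"price_usd": 300, "tflops": 0.05},
    "recent consumer GPU": {"price_usd": 700, "tflops": 40},
    "data center GPU": {"price_usd": 70_000, "tflops": 1_000},
}

for name, spec in gpus.items():
    cost_per_tflop = spec["price_usd"] / spec["tflops"]
    print(f"{name}: ${cost_per_tflop:,.2f} per TFLOP/s")
```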
(04:09):
So now we're at the point where we have these insane data center GPUs that are used, you know, primarily for generative AI. They're not really used for gaming, although that would be sick.
Oh, I mean, I can only imagine.
Yeah, right, you might be able to use all the capacity, but we should be running everything
(04:30):
at, like, the highest setting possible. Your electric bill is like $80,000.
You know, now we're at this point where it was possible to build those data center GPUs at the price point that they're at today. And that's not to say they're not expensive. I think one of those is like 70 grand. Those aren't for people to buy. Those are enterprise level.
But you know, without the push, the consumer push for GPUs, and the acceleration in that development, making things
(04:52):
cheaper, making each of those calculations cheaper over the years, I don't think that we would have been primed for the AI revolution that we have today, because the compute power necessary is just incredible.
Absolutely agreed.
And it's interesting to see how the CUDA ecosystem and everything NVIDIA has built there really set the stage for this last 10-plus years of development in AI.
(05:14):
And as you mentioned, you've gone deep on this topic before, and maybe we'll have to have you back for a full episode where we just focus on that. But I love that you've been able to carve out such an interesting niche for yourself here as a director of engineering at Activision, then Microsoft, a founder, an influential industry voice with over 170,000 LinkedIn followers today. You've obviously done a ton of public speaking, and you're one of the lucky folks who gets to
(05:36):
talk about video games and their impact on AI and data every day.
What first made you fall in love with gaming? Was it RollerCoaster Tycoon? Something else?
I think that was definitely part of it, if I'm being honest. That was one of my favorite games.
But I guess I've identified as a gamer ever since I was really
young. I started playing video games
(05:57):
with my mom when I was probably too young.
Nowadays they'd say, you know, don't let your kids sit in front of screens. But I was like this far away from an old CRT TV, just with my nose straight up against the glass, you know.
And I think that as an only child, the experience of not just playing alone but playing with someone, playing with my mom, and later, you know, like
(06:19):
sitting next to each other on the couch playing video games with your friends, was just such a vibe.
And then growing up into multiplayer online games, I think that it's just been something that has been part of my life, like, recreationally, since I was young.
And then, really, I think
(06:40):
the role gaming played in my life was really exemplified for me during COVID. 'Cause, like everyone else, I was stuck inside. But also, like everyone else, I was playing Warzone basically constantly with my friends. And it was a great way to talk to people, 'cause we were all so isolated. So we'd all sit down, make a drink, hop on, you know, and just kind of, like, talk
(07:02):
crap with one another and hang out.
And yeah, it was a really good time.
And it was at that time that I saw the job posting at Call of
Duty. They were looking for a data
scientist. And I remember talking to my
squad on Warzone and saying they're looking for a data
scientist for Warzone. Wouldn't it be crazy if I got this job? And, like, the rest is history.
(07:23):
But yeah, I think I've just been a gamer. I don't know anything different.
That definitely resonates with me as well. Right there with you with RollerCoaster Tycoon. And then for me, like, my first time really getting into coding, like, I'd started to do a little bit helping refurbish computers, you know, basic stuff around using a command line to actually
(07:46):
refurbish them and, you know, update the software.
But then it was, like, building websites for a RuneScape clan I was running. That was honestly my first time doing quite a bit of coding.
So it's funny to see how gaming will drive folks to take on these new challenges. And you know, you mentioned the social, collaborative element. I was doing a ton of Among Us during the pandemic, where it's like my friends and I would get
(08:07):
together and do a little murder mystery together. So I totally hear you on that.
And it's really cool to see how you've not only been such an integral part of the industry and seen it up close, but have applied your background as a coder and data scientist and looked uniquely, I think, at how traditional game development has had kind of a data problem.
(08:32):
So let's set the stage for this conversation with some grounding in how game development happens, plus how game data is captured, since I think that'll help inform this whole conversation around applying AI to gaming and how it all flows into one another. So we'll dive into the intersection of AI and gaming a bit later, but can you walk us
(08:52):
through how games are traditionally built? Particularly, focus on how data and telemetry fit into that process?
For sure.
Yeah, I think it really depends. Like, the traditional game development process is going to vary based on if you're at a startup, if you're bootstrapping things yourself, if you're a AAA game studio; it might look different. I think the unifying theme,
(09:12):
though, is that game development is creatively driven, which is good, right? Like, that's how we get these immersive worlds and these really enriching stories.
The issue with traditional game development, though, is that when you're coming at things purely from a creative standpoint and you're translating that into engineering requirements for how the world needs to be built,
(09:36):
you're thinking about data. But you're thinking about data to accomplish a singular goal, which is to get the game out the door, right? Everyone wants to make ship.
Like, that's the goal, right? And you see, when things get pushed back, like Grand Theft Auto just got pushed back again, there's a community reaction and there's a cost to not shipping your game. And so these deadlines are
(09:59):
taken very seriously, and working in game dev, I can tell you that you were constantly counting down the minutes until you go to alpha, until you go to beta, and then until you go to ship. And again, driving that process is this creative, iterative process. There's greenlights, there's all sorts of things that studios are balancing and juggling, just trying to get the
(10:20):
game out the door. And from what I've noticed, oftentimes data ends up becoming a second-class citizen in those conversations, because while it's critical to the game development process, and specifically to the live operations process, which is how you support your game once it's out, you can still ship a game without data.
(10:41):
You just won't know anything about what's going on.
You know, there are things in telemetry that you need to know. And as the game is being built, engineers put this in because they need to see if things are working properly, right? They want to capture: did you or did you not kill that boss at the end of the level? Right. Like, those things are very critical to know.
There are other things that might not seem important until you
(11:05):
realize they're extremely important. Things like the session length of a player, or if they got stuck in some onboarding step somewhere, right? If your controls or your mechanisms or your game design have something that's too hard for people to pick up, or something that's unintuitive, those are things that you can answer with data, but you have to be extremely purposeful about
(11:29):
how you're defining the scope of what you're trying to understand and how you're going to reflect and look back upon the decisions that were made.
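As a rough illustration of what being purposeful about telemetry looks like in practice, here's a minimal sketch of instrumenting the kinds of events described here. The event names and fields are hypothetical, just to show the shape:

```python
import json
import time
import uuid

def emit_event(event_name: str, player_id: str, **fields):
    """Append one telemetry event as a JSON line; a real game would
    batch these and ship them to a collection endpoint instead."""
    event = {
        "event_id": str(uuid.uuid4()),
        "event_name": event_name,
        "player_id": player_id,
        "timestamp": time.time(),
        **fields,
    }
    with open("telemetry.jsonl", "a") as f:
        f.write(json.dumps(event) + "\n")

# Engineers naturally add the "did it work" events as they build...
emit_event("boss_killed", player_id="p123", boss="level_1_boss", attempts=3)

# ...but the design questions need events like these, defined up front:
emit_event("session_end", player_id="p123", session_length_s=1840)
emit_event("onboarding_step", player_id="p123", step="tutorial_jump", completed=False)
```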
And the unfortunate reality is that when you're coming up against ship, the first thing to slip is like, well, you know, whatever, we'll figure it out when we figure it out, or we'll see it in game reviews. If people can't get through onboarding, you know, we'll hear
(11:49):
from them. And there is some piece of that that's part of this. But the in-game telemetry, again, often just ends up becoming a second-class citizen, because shipping is the most important thing, and unless it's something extraordinarily important or some financial metric, it's never a blocker to ship a game. It's just a really nice to
(12:10):
have.
It's such an interesting, I guess, illuminating fact, because, you know, from the outside, as somebody who's just a casual game player not working directly in the industry, you know, obviously the term AI has existed in gaming for decades, you know, referring to NPC behavior. And so my expectation was just, oh, of course, there's so much data being collected, because
(12:32):
they have to fuel these, you know, non-traditional actions, potentially deterministic, potentially non-deterministic, based off of, you know, how a player is interacting with the game and, you know, the settings you have applied.
So it's almost confusing to think, oh, data isn't a first-class citizen when it comes to gaming.
So, I mean, you mentioned this pressure to ship as a big part
(12:55):
of why telemetry sometimes kind of gets forgotten here. Is part of the challenge here that studios aren't really having the conversation in advance about what they actually want to leverage that data for until after the game is live, because they're just saying, hey, we've got to ship this thing? Or why do you think these kinds
(13:17):
of crucial data strategy conversations are happening often too late in the process?
Yeah, I think that's a really good question, and that's a good guess.
I think that there is a lot of data. It's not always an issue of the data not even being collected. So, like, that is definitely part of it.
(13:39):
It's like, if you're not actually instrumenting the telemetry, like I said, if it's a second-class citizen and it's not there, it's not there. But the question then becomes, even if it is there, what are you doing with it?
You know, the gaming industry is world class in a lot of things that they do. But I will say that the data adoption curve for most gaming studios is probably behind where
(13:59):
you would consider most tech companies in terms of best practices for where the data needs to go. And a lot of that has to do with, like, every game is so different, and a lot of them have these extremely complicated, you know, especially for online games, server-client architectures.
How often are you moving data, and where is it going? Are you saving all of your data? Are you sampling it? Where are you putting it? How are you transforming it?
(14:22):
And then, you know, that's just the question of where does it go? Then you have all these other questions about how do you transform it and get it business-ready. And those aren't always immediately obvious answers, especially since the people handling the data are game devs. They're not business analysts, right? Like, they know what they care about in their game, but that might not align well with what
(14:45):
finance or marketing cares about, right? And in order to find a harmony where you have everything captured that everyone could possibly care about, you end up in a space where you're doing a lot of trial and error.
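For a sense of what getting data "business ready" might look like, here's a minimal sketch that rolls a raw event log (like the one from the earlier sketch) up into the kind of per-player summary finance or marketing would actually read. The fields are hypothetical:

```python
import json
from collections import defaultdict

# Roll raw per-event telemetry (one JSON object per line) up into a
# per-player summary analysts can query directly; a real pipeline would
# also bucket by date and land the result in a governed table.
summary = defaultdict(lambda: {"sessions": 0, "playtime_s": 0})

with open("telemetry.jsonl") as f:
    for line in f:
        event = json.loads(line)
        if event["event_name"] == "session_end":
            summary[event["player_id"]]["sessions"] += 1
            summary[event["player_id"]]["playtime_s"] += event.get("session_length_s", 0)

for player_id, stats in summary.items():
    print(player_id, stats)
```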
And unfortunately, in the game space, trial and error is expensive dev time, and time to get things into the, you know, live build. Like, we're talking 6-8 weeks
(15:06):
sometimes, you know, so missing something is very expensive.
And I think it's really just now becoming so obvious that gaming is not only a huge opportunity in terms of the number of people who play video games. I think we probably had this idea of the
(15:27):
industry from 20 years ago, where the only people who play video games are, like, you know, the kids in Stranger Things who were also playing D&D. Like, it's this weird nerd habit. And I don't think that that has proven true. I think mobile games have opened up a whole new world for people.
And so when you have anything that's being massively adopted
(15:49):
across the world, you're going to have behaviors that you don't understand. Even if you've played your game 1000 times, it's nothing like having 1000 different people playing your game once, right? Like, the possibilities, like, the
feature space of things people can do.
(16:10):
People break things very creatively, you know, and that's just part of the human experience.
And I think, you know, unless you've done it before... like, these AAA studios are getting a lot better with data 'cause they have to do it all the time. But it's still been only in the past maybe five years that data has become something that's taken more seriously and that has a proven ROI.
(16:33):
But, you might even be capturing your data, but if you're not set up to act on it, then you have another problem. Like, are you prepared to do experimentation? Are you prepared to make changes if you see that people don't like some mechanic? Like, are you prepared to just rip something that you like out of your game based on feedback? It sounds obvious, but some
(16:53):
people are not willing to do that, because they feel like they understand things in a way that data can't capture. So if you don't have a data-first mindset in the people who are looking at it, it doesn't matter what it tells you, or if you're capturing it, or if you're seeing anything, 'cause no one's going to listen to it.
You're bringing up so many
good points here. And I think where I want to
(17:15):
start is maybe the most basic example of this idea, where we're always designing for how we think the customer is going to interact, no matter the business, whether it's gaming or something else. We think, oh, they're going to do this. And there's always a creative way that they break it and do it differently from us. And I want to talk about this difference between QA'ing and testing in games and game
(17:39):
design versus how things happen in actuality as players actually get exposed to it. But to me, it almost is simpler. Like, there's this very common video, we've probably all seen it, of a cup with little holes for shapes in it. And it's like, oh, what shape do you put in which
(18:00):
hole? And every shape goes in the square hole, because, like, oh, they all fit in there. It's, we don't need to put it in these designed holes.
And I always think of this image.
People who listen to me in different podcasts have probably
heard me reference this before because it's just the one that
always sticks in my mind. We talk about this, like, design problem of how is the user actually going to use this? It's like, oh, they're going to find the basic use case, they're going to apply it, and they're going to go, oh, I can spam magic missile in this game and I'm just going to keep hitting
(18:22):
people with magic missile. I don't need to try these eight other things, so I can just hit them with more magic missile. And this comes back to that difference of QA teams playing a game versus how the general public plays it, and how 1000 weird gamers like myself are going to go break things in different ways. How does this gap impact the usefulness of data collected before a game hits beta or a
(18:43):
full launch, compared to once you have, you know, all these different folks trying and breaking the game in unique ways?
It's crazy, because you don't
realize your own blind spots or your assumptions until they're tested by people who are coming at it completely with brand-new eyes, right? A lot of this can be circumvented if you have, like, a good user testing framework,
(19:05):
right? So a lot of the big studios will bring in random people, literally off the street, for playtests to see how they play. And I've seen some of these playtest areas, and they're crazy. Like, they do eye movement tracking across the screen and see how you interact with things. And that's very high-tech and can get you a lot of the way
(19:27):
there. But it still cannot account for regional differences, right? Like, are you doing playtests in every market you're in? Probably not. Like, you try to get good demographic data, but it's hard to. So, like, are your translations in-game landing in some country where something is not a norm,
(19:47):
and people kind of don't understand what's supposed to be, like, a well-understood social interaction that leads to some mechanic? Like, it's easy to take those things for granted. And it's also easy... well, not easy. It's hard to truly test the resiliency of your infrastructure at scale.
(20:09):
So companies will obviously do stress testing of their data pipelines, of their servers, of all of their architecture, to make sure that they can handle what they would expect at, like, launch capacity. You know, they'll stress test all their systems and make sure that they're not going to completely implode. But there's nothing like a production environment to truly test the things that you've
(20:30):
built. I posted this on LinkedIn the other day. It's like a funny joke, but it ends up being true in more industries than just gaming, where you have people saying, like, hey, can we hire more QA, you know, to test the product before it goes out, and the company just says, we have QA. They're called users.
Like, the sad truth is that it ends up being a lot of
(20:52):
weird mechanics, bugs, holes in the maps. Like, you know, just stuff that is kind of hard to find doesn't get caught until you unleash thousands and thousands of people who are getting into crazy positions and taking vehicles where they were never meant to go and just doing all the wild stuff that a bunch of roving bands of lunatics are
(21:12):
going to do, you know?
But as you mentioned earlier, this also creates a secondary problem of how much you are actually going to react to that user data. And it sounds like there are a lot of disparate approaches across the industry, and maybe hurdles that have to be overcome to push an update and actually react quickly if something is buggy
(21:34):
and broken.
It's hard, yeah. So, I mean, a lot of gamers don't realize this, but let's think about a game that is on PlayStation, Xbox, and PC, right? Like, the standard stack of platforms. To get a game update, let's say just on
(21:58):
PC: you know, I worked in security. Let's say I wanted to do an update to a kernel driver that's only on the PC version of the game. So I don't have to touch PlayStation or Xbox, but I need to send a PC patch out. How often do you think the platform, you know, in this case Microsoft for PC, updates?
Like, how often are they going to let me make everyone who plays
(22:19):
my game update just to get a security patch on? Like, can I make people download an update every day? What if I have to do it across all the platforms? You know, like, how much of an update is worth updating for? Am I going to ask someone in rural Australia to pay, like, a
(22:39):
crazy amount because they went over their data cap this month, because I forgot to put a piece of telemetry in the game, so I have to ask them to download an update? Like, there are certain things that just are not critical enough to justify the user friction to push out things like that. Not security, security is a really big one that is justifiable, but there are other things that are, like, nice-to-
(23:00):
haves that, sorry, I'm not going to make, you know, millions of people around the world download an update because you forgot to get a session length telemetry marker in the game. Like, it's just not going to happen.
So, like, you'll get that data when you get it, but it'll be maybe in a couple weeks, and that's if you can get a fast patch out. For other things, I mean, we're talking, like, a
(23:20):
build process. Six weeks is usually the time frame you're looking at. Like, you're looking at your new content for a season update. Like, you're months ahead, you are testing early. These build pipelines are long, 'cause you have to make sure you're not going to break things with PlayStation, you're not going to break things with Xbox. Like, they have to all play nice together.
(23:41):
It has to get into their store. Like, you know, there's this actual pipeline for development. We have to do our own internal QA. PlayStation has their own QA for certain things. Like, there's a lot that goes into making sure a patch is safe and is OK to be put into a game.
And data is usually not one of those things where you can say, like, hey, can we spin up overnight QA resources
(24:04):
because I forgot this piece of data? Like, we're going to need to pay people overtime to work overnight, because we have to change this patch, and everything has to be QA tested before it goes out, because you never know. You can put a piece of data in and it could break something. I've broken games before with a data update. Actually, one I did really badly, because I flipped a greater-than/less-than symbol. That was a fun one.
(24:28):
But it's always the simple mistakes.
Oh my gosh, always. So you have to, I mean, you have to test that stuff, right? And so the pipeline for that is long. And if it's not an immediate thing that needs to be done because it's a breaking change, like, you've got to wait.
So from your perspective, what
does it look like for a game studio to actually unlock the
(24:49):
full potential of their data? How can they more effectively
design their games to get the data they need from the start
and then leverage that later?
You know, I think it starts with really the question that everyone wants answered, right? Not just in gaming, everywhere. How do I get value out of my data? Everyone has too much data, and
(25:12):
everyone wants to know how to get value out of it. Nobody wants to just ignore it and leave it on the table, right? I think in the year 2025, very few people are like, no, I don't use data, I just go off vibes, you know?
So if we start from the question of how do I get value out of my data if I'm a game studio: well, you need to figure out what kinds of questions you're always asking yourself, what kinds of things you feel
(25:33):
like you have a good handle on, and what kinds of things you feel like you have no idea about.
Like when you last shipped something, what were you
completely blindsided by? Was there something people loved
or something people hated that you should have probably seen
coming? Was there something that broke
that you probably should have been prepared for?
These kinds of things will come up the longer you do it, and the
(25:55):
more you realize, like, OK: the more postmortems we do, the more retrospectives we do, the more we look at what every executive is asking after every single time we launch, we'll realize, OK, these are the low-hanging-fruit questions that we're always trying to, like, scramble to answer.
And then you start incorporating data people, whoever that is.
(26:16):
That could be your data engineers who are handling getting the data somewhere. That could be, you know, your head of data analytics, who's on the hook for actually answering the questions once the data comes through. It should be all of the above, and you should be including these people in the development process, probably pre-greenlight. Like, before every greenlight session, you should have at least one debrief about, like, how are we looking with
(26:38):
telemetry? Are we going to have the things that we need to answer the questions that we deem are important? Like, what are we trying to accomplish with this game, right? Or is it a sequel? Are we trying to get new players?
Like, it might seem obvious, like, you want people to play it, right? But I think every game probably also has some other things they're trying to do. Are they trying to further some IP? Are they trying to tell a new story? Are they trying to reach a new market?
(27:00):
Are they trying to change from hyper-casual games to more, like, you know, serious games? Like, what's the studio trying to accomplish with this game? And then, what kinds of questions do you think we're going to answer? You'll never get comprehensive, but if you start asking yourself that early, you won't find yourself caught flat-footed.
And I can tell you, the worst time to be caught flat-footed is
(27:24):
often during your beta. Because, for players now... it used to be that beta was like, well, if it's, you know, broken in beta, we'll fix it before we ship, right? Like, beta is where stuff is supposed to be busted.
But what you'll find is that you are asking the questions of your beta cohort that you're going to ask in the live game.
(27:47):
And that's great if you're like, oh crap, we can't actually answer that. That'll happen. But you shouldn't go in totally blind and say, well, we don't have any data; these are all the things we want to know; we won't know, because we didn't have it for the beta; so at least we'll know by the time we get to live. Because then you can't act on anything you could have learned in the beta. All you're learning is that you weren't prepared from a data perspective.
(28:10):
And then, players also have an expectation of betas now, that they're a bit more polished. It's going to set the stage for how your pre-orders are going to look. It's going to set the stage for what people's expectations are. It's going to set the stage for reviews. And so if you are going into beta thinking beta is when we're going to make our data decisions, like, you are way too
(28:30):
late. Like, even alpha, probably. Like, it's a good time to have those questions if you've waited that long, but that might also be too late, because, depending on how much telemetry you're missing, you're touching a lot of pieces of code, and that's risky. You know, you really want to be doing this as you're building, so you can kind of de-risk it and kind of build gradually as
(28:53):
opposed to, like, a mad dash at the end.
Yeah, you're absolutely right that there is, I think too often, not just in gaming but in many industries, this mad dash at the end of, like, oh shoot, we have to go fix this thing that we kind of knew we were going to have to do and we didn't plan for, necessarily. And as we've alluded to throughout this conversation, one of the key trends that is
(29:14):
reshaping how we leverage data today in gaming and beyond is AI. And while AI has existed in gaming, as we've referenced things like NPC behavior, for years, there's obviously a lot of new stuff here with generative AI. How does this traditional game AI contrast with the new generative AI wave and the
(29:38):
updates that have happened to machine learning as it's exploded over the last 20 years? How are you seeing all these changes in game data and these improvements in telemetry intersect with, I guess, the opportunity for gaming studios from new AI technology?
You know, what is so funny about
this conversation about AI is that, as someone who is
(30:00):
involved on both the industry side and now on the vendor side at Databricks, you know, one of the biggest companies in AI, I've personally witnessed these conversations happening where I'm kind of just a bystander. And the way that game studios and vendors are talking past
(30:22):
each other, without even realizing it, when they talk about AI is, like, hilarious to me, but also, like, an easy area for me to make an impact at Databricks, because I'm like, hey guys, here's where we're completely missing the mark when we talk to them about this. Like, I can tell you, you are having three just completely parallel conversations, and no
(30:43):
one is actually talking about the same thing, and you think you are, but that's why there's so much friction: no one's understanding each other. And that's because of exactly
what you just said. Game studios, developers, gamers
have been talking about AI forever. We were probably the first people. Not only did we set the stage for, you know, AI as we think of
(31:04):
it today, with GPUs and adoption there, we also set the stage for people to even use the word AI. I mean, maybe not, but, like, come on, we popularized it. We're the ones that used it colloquially the most, right? Even calling someone an NPC now is, like, a meme, right? Like, gamer culture's everywhere.
And so when you have a vendor
(31:25):
come to a game studio and say, like, we're going to help you build intelligent AI, they're first of all thinking in-game AI. That's all they're thinking about. They're like, OK, so you're going to make our NPCs not so stupid. That's great. You know, that's feedback we get all the time. Like, yo, you know, we love the game, but, like, how could it be more immersive? If so-and-so didn't have, like, a specific talk track, like, make
(31:47):
it a bit more free-form. That kind of thing is good.
But the issue becomes when you come to a game studio and you say, yeah, we're going to make your AI, like, you know, do X, Y, and Z. Even if we're talking about in-game AI: we're going to make it so smart. It's going to be able to talk to the player. It's going to be able to remember everything they did. It's going to do blah, blah, blah.
(32:08):
All these lofty promises. What no one, except for me, to toot my own horn, in the vendor space has been able to tell anyone is, like, OK, well, how are you going to do that? Like, you realize we're resource-constrained? Show me, like, is the compute to do this in the room with us? Like, you know, every in-game AI has been, like, hyper-optimized
(32:34):
to run, because that's what game companies, that's what gamers, have had to do. Going back to RollerCoaster Tycoon, you've got to make this actually work with what you've got, right? And not everyone has the $70,000 NVIDIA GPU ready to go to rock the latest game.
I wish I had that, right?
I know, right? Like, we're dealing with min spec, like, the minimum specifications to play
(32:56):
the game. So not only that, but the question of LLMs within gaming is super interesting to me, because it's like, OK, in games, traditionally you have two choices. You'll run something on someone's client, so, like, on their PC, on their PlayStation, whatever it is; it will be locally computed, rendered, whatever.
(33:18):
Or you'll run it on a server. If you're in a client-server architecture, that will be running in some data center that everyone is connected to, and then whatever the server does will talk to the clients and tell them what it decided, right? Those are your only two options for where to run things, usually.
There are constraints on both of
those, to your point.
100% right. Like, normal NPCs run client-side, right?
(33:39):
Like, they just have to, like, walk around the map and whatever. And some of them are running on the server, like, depending if it's a bot. But if you're playing, like, an open-world single-player game, most of that's local.
Now we're introducing, let's say, OK, yeah, we'll just put an LLM on it: someone's proprietary LLM that we trained to make the NPCs have cool conversations.
(34:00):
Where are you going to do that inference? On the client? You could, I guess, if someone had a really good GPU. But what, are you gonna send your trained model to someone's computer and just give them your IP? Model weights for LLMs are worth hundreds of millions of dollars. Billions, maybe. Think of OpenAI.
(34:23):
Are they gonna give you their trained model weights for ChatGPT? That's their IP; you would never do that. You'd realistically have to use, like, a local instance of, like, a Llama model, for example, as really the only realistic option. Like, an open-source something.
Yeah, like, and that's cool. So then how do you integrate into that? How do you make sure that it is
(34:44):
going to respect your IP, that it's going to maintain the voice of the character that you like? Like, does that feel good? OK, well, let's say then you're going to fine-tune it on your own data. Well, now you have an IP issue again, right? Because now you're leaking your IP back by giving it to people to run locally. OK, then you say, well, then run it on the server and just send the information down.
(35:07):
Game servers don't typically have GPUs; they don't need them, because they're not rendering anything. They don't have screens. It's a server. So, like, where are you? So the question becomes, you're telling me you can do this thing, but you're not actually giving me any ideas for how I can realistically do this.
Now, there's another option. It's not just the server or the
(35:28):
client. You know, you can have something else in the loop; we have this all the time. We have VMs that sit next to servers with the data, right? Data comes out of the server and goes somewhere else. Like, those aren't the only two options. You can run things and send them down to the client that don't come from the server, or you can send information to the server to send to the client. That's all reasonable, but no
(35:49):
one's really talking to game studios about that. They're just saying, oh yeah, just put an LLM on it.
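To make that third option concrete, here's a minimal sketch of the "something else in the loop" idea: the game server relays NPC dialogue requests to a separate inference box hosting an open-weights model, then sends the result down to clients. The endpoint, model name, and payload shape are assumptions (modeled on a typical OpenAI-compatible local serving API), not any studio's actual setup:

```python
import json
from urllib import request

# Hypothetical sidecar: an inference VM next to the game server hosts an
# open-weights model behind an OpenAI-compatible HTTP endpoint. The game
# server never runs inference itself, and model weights never reach clients.
INFERENCE_URL = "http://inference-sidecar.internal:8000/v1/chat/completions"

def npc_reply(character_prompt: str, player_line: str) -> str:
    payload = {
        "model": "local-llama",  # placeholder model name
        "messages": [
            {"role": "system", "content": character_prompt},
            {"role": "user", "content": player_line},
        ],
        "max_tokens": 80,
    }
    req = request.Request(
        INFERENCE_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# The game server calls npc_reply() and sends the text down to clients
# like any other server-authoritative state.
```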
Yeah, I think it speaks to the kind of common problem that's happening in a lot of so-called AI marketing today of just considering an LLM as a magic bullet instead of a really useful tool that has constraints, to your point.
(36:11):
And it feels like the opportunity with gaming for the moment with AI, beyond the NPC piece, which has obviously been there for quite a while, is more about, like, OK, how are we leveraging all the data we collect and applying AI to our analytics? Or are there QA opportunities with LLMs and throwing them at games earlier, so that you can
(36:34):
have that user experience of, like, all this disparate testing? I'm curious if there's much happening in those directions so far.
I see those as being much easier problems to solve and much more realistic problems to solve. So, specifically with AI with your data, right: the good thing about modern data infrastructure, Databricks in particular, but every good data
(36:57):
provider and product suite is offering something similar, is that once you have your data in a place where it is centralized, it's governed well, right? So, like, you're legally compliant, you have everything in the place it needs to be, you have all of your reporting built on top of it, you're understanding your business cases, you're building your
(37:19):
business-ready, gold-standard gold layer of data; you have all of your really good data principles handled. There are tons of products that are integrating directly into your data warehouse, directly into your data catalogs, like with Unity Catalog on Databricks, getting directly in there, and you can get AI natively within the platform,
(37:41):
right? So if you have good enough metadata, which is just a description of what all your data actually is, which is actually harder to come by than you would think in gaming. But once you have good metadata, you can ask AI almost anything about your data,
right? Like, self-service analytics is here in a big way. Like, in a way that even a year
(38:02):
ago I wouldn't have believed would have been possible, and it has accelerated this quickly. So, like, we're opening up doors for non-technical people, people who don't write SQL every day, to ask questions of their data in plain language, where they can say, like, hey, how did, I don't know, that skin sell in last week's, like, you know,
(38:26):
4/20 sale? Like, we did a collaboration, you know, do people like it? Or, you know, what types of skins are people loving, or what was our most popular game mode in the last week? Did we see any crazy things pop up on Steam this weekend? You know, have we seen any data outages? All of that stuff can be
(38:50):
somewhat automated and somewhat self-service, which is crazy and incredible.
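A hand-wavy sketch of why the metadata matters so much here: the richer the catalog descriptions, the more an LLM has to work with when turning a plain-language question into a query. The table, columns, and LLM call below are all illustrative, not a real schema or API:

```python
# Hypothetical: descriptive metadata is what lets an LLM translate a
# plain-language question into SQL against the right table.
catalog_metadata = {
    "table": "gold.daily_skin_sales",
    "description": "One row per skin per day; revenue in USD after refunds.",
    "columns": {
        "sale_date": "calendar date of the sale (UTC)",
        "skin_name": "display name of the cosmetic item",
        "revenue_usd": "net revenue in US dollars",
    },
}

question = "How did the collab skin sell during last week's sale?"

prompt = (
    "Write a SQL query to answer the question, using only this table.\n"
    f"Metadata: {catalog_metadata}\n"
    f"Question: {question}"
)
# sql = some_llm(prompt)  # placeholder for a model endpoint; without the
# column descriptions above, it would have to guess what revenue_usd means.
```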
So that's part of the AI that, when you go to game studios and you're like, we're going to help you build AI, they're thinking in-game AI. They're not thinking, oh, I can have execs self-service the small data questions so that my data team can continue answering the big ones and putting together the long-form
(39:10):
financial reporting, putting together these really intensive deep dives and experimentation, like A/B tests, causal inference, things that we're doing. And they can just ask the silly question that they had about, like, do people like the new, like, you know, map? Like, the things that are interesting but, you know, kind of derail conversations when
(39:31):
you're talking about longer-term analytics projects.
And then the other one you brought up is very interesting, which touches a bit on the NPC conversation, kind of, but it also touches on machine learning and advances in machine learning, advances in in-game bots, that we've had going back probably 10 years. And reinforcement learning is
(39:51):
big here. And I'm talking about, how do you help your human QA team by building more automated QA and reinforcing all the work that they do with automated QA bots? And that's also not an easy question to answer, because you
(40:12):
have to have a really tight pipeline, because basically what you're doing is you're using your human QA testers to train the reinforcement learning bots how to do their jobs. But then every time you introduce an update, the human testers have to take the lead, because these bots don't understand these mechanics on their own.
(40:34):
They learn from people, and they need to learn from QA, because you don't want to have them learn from players play testing, because now you're backwards. You're having players generate the data.
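Here's a toy sketch of that "bots learn from human QA" idea in its simplest form, behavior cloning: record state-action pairs from human testers and replay the most common choice. Real systems layer reinforcement learning on top of this; every state and action name here is made up:

```python
from collections import Counter, defaultdict

# Toy behavior cloning: learn "in this state, human QA testers usually do X".
# States and actions are stand-ins for real game observations and inputs.
human_qa_traces = [
    ("facing_door", "open_door"),
    ("facing_door", "open_door"),
    ("facing_wall", "turn_left"),
    ("facing_ledge", "jump"),
    ("facing_ledge", "jump"),
    ("facing_ledge", "walk_forward"),  # testers also probe the failure case
]

policy = defaultdict(Counter)
for state, action in human_qa_traces:
    policy[state][action] += 1

def bot_action(state: str) -> str:
    """Pick the action human testers most often took in this state."""
    if state not in policy:
        return "explore_randomly"  # no traces yet: humans must lead on new mechanics
    return policy[state].most_common(1)[0][0]

print(bot_action("facing_ledge"))  # -> jump
print(bot_action("new_mechanic"))  # -> explore_randomly
```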
Yeah, the players are lunatics. Also, you don't want your users to be your testers, right? You need to test the stuff before users ever see it. And so this ends up being a really interesting question and an interesting problem space.
(40:55):
I've seen a lot of movement in the, like, automated QA, I don't even know what other people are calling it, yeah, just automated QA arena. I haven't seen anything yet that's, like, fully compelling to where I'd say, OK, I'm willing to, like, smoke test with a fleet of these bots and trust it. I definitely wouldn't trust it for a full QA pass, but I do
(41:20):
think where it's going to help is, like I said, if you have these crazy, you've-got-to-do-an-overnight-test situations where you have to just get good enough, and you need these bots to run through along with a couple of humans, and you don't have that many people online. I think that would probably be the best use case scenario right now, where, as long as it's not fully breaking something, it has
(41:41):
to go out because it's some sort of critical patch. But yeah, I think there's going to be more advancements there as well. You just have to be mindful, like with anything: what does your pipeline look like?
Are there other use cases or opportunities that you're seeing for AI in modern gaming today?
Oh, tons. I'm building something right now
(42:04):
with a group of people where we're pulling in Steam reviews, and not only are we, you know, leveraging that to try to do some of our localization. So, like, can we train an LLM, or just use a prebuilt one? Can it understand the colloquialisms of gaming, and can it effectively translate different languages and understand the sentiment of reviews? But also,
(42:27):
can you then generate reports based on those reviews, regardless of the language, but including localization information, so that, you know, like, oh, hey, there's something wrong with, like, the French version of the game, turned into, like, emailed reports for execs? Because that was something that we saw a lot: we had our community managers online
(42:49):
basically 24/7, having to scroll through Twitter, which, like, they should get a bonus just for having to spend time in there, and having to scroll through Reddit, which, again...
Maybe a bonus again? Yeah, double bonus.
And having to just, like, scroll through all these things and try to synthesize the information on their own.
(43:10):
And the way a lot of that used to work is you'd have your community managers doing that. And then they'd say, OK, a lot of people are talking about, like, I don't know, the French version of, like, Bowser. Like, he sounds weird. And then they'd have to set up listening on their social listening tools for, like, French Bowser. And then they can go through the data that they're scraping and
(43:31):
find instances of that. But someone has to know that they need to look for French Bowser to begin with. You end up in this chicken-and-egg scenario where you kind of have to know your problems in order to look for them in your massive data.
And where LLMs succeed a lot here is that you don't have to give them any preconceptions. You don't have to say, hey, go look for Bowser and tell me, like, where you see an issue here. You can just say, what are
(43:53):
people talking about? What do they care about? Have there been any emergent conversations that seem different from what you're used to seeing? Are there any topics that seem like they're gaining traction? And then it can say, oh yeah, people are mad about this French Bowser. Like, he does not sound right.
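A rough sketch of that open-ended prompting pattern: no pre-seeded keywords like "French Bowser", just an ask for whatever is emerging across languages. The reviews and the LLM call are placeholders:

```python
# Hypothetical: surface emergent topics from raw multilingual reviews
# without seeding any keywords. The reviews and summarizer are placeholders.
reviews = [
    "Love the new map, but matchmaking feels slower this week.",
    "Pourquoi Bowser a-t-il un accent bizarre dans la version française ?",
    "The French dub of Bowser sounds completely wrong lol",
]

prompt = (
    "Here are recent player reviews in multiple languages.\n"
    "What are people talking about? Are any topics emerging or gaining "
    "traction that look different from typical feedback? Answer in English "
    "and note the language or region a complaint comes from.\n\n"
    + "\n".join(f"- {r}" for r in reviews)
)
# report = some_llm(prompt)  # nobody had to know to search for "French Bowser"
```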
Honestly, I'm just struggling right now, 'cause I am in my head now picturing Bowser with a French accent, and I'm gonna have
(44:15):
to go find this after the show.
I just made it up, but that would be hilarious.
Really. So I'm curious, as we're
wrapping up here, do you see gaming continuing to push AI forward, as it has enabled the GPU revolution, or in
(44:36):
other ways in the coming years? Maybe it's going to simply add French and Spanish Bowsers and we can experience those for fun. But, you know, in what other ways is gaming going to impact AI
development going forward?
I think, you know, in the ways that it continues to, or has always, historically. Like, gaming is really, for me, the space that I think, humans, probably,
(44:57):
other than, like, writing... it's like the open world where humans can be as creative as they want, right? Like, it's where the only limit to what you can accomplish is your imagination. And so as we want to be more and more imaginative with the worlds that we occupy, as we want to do more and more with our technology, as we want more
(45:19):
and more people to experience things simultaneously online, like, that all happens in games, right? You think of, like, Fortnite, the events that they have; they've basically built best-in-class online networking systems.
I've gone to concerts in Fortnite, which is weird to say.
(45:40):
Yeah, no, but like, when you think about the feats of engineering that it takes to put those concerts on, you realize, like, nothing would have ever propelled the needs of real-time online networking and multiplayer experiences if it wasn't for gaming, right? Like, who else would have been innovating in that space? Zoom?
(46:02):
Like, maybe, kind of, but they still can't even really handle big webinars. So, like, it's gamers who want to do this stuff, and they're like, can we now? No. Can we figure it out? Hell yeah. So it's that spirit of just
like, this is what I want to build.
This is what I want to create. This is the kind of world I want
to see, and we're going to do it. And it's so funny, because I
(46:24):
see a lot of people talking about, you know, there is this inherent friction between creatives and engineers, where they're like, wouldn't it be cool if we could, you know, whatever. And, like, one of my favorite engineers I used to work with would say, like, yeah, if only it wasn't for the pesky speed of light. You know, like, that's always the limit. Like, there's always some inherent friction.
But the good thing is that, like, the creatives I've worked
(46:46):
with, like, their goals are so lofty and not grounded in things like reality or the speed of light or, like, you know, physics, or where the rest of us live, that, like, they force us to, like, constantly push forward. And while we might joke about it, like, there's so many things today we wouldn't have if it wasn't for gaming. And I think it's going to be... who knows where we're going to end up. I think it's going to be crazy, though.
(47:07):
That is a great note to end on and honestly, for anyone who
wants to see where this all goes, I highly recommend
following Carly on LinkedIn, where she shares so much
interesting content and perspectives on the industry and
what's happening with AI. Carly, I am deeply appreciative that you joined me for this conversation today. It's been a ton of fun. Where else can our listeners go
(47:29):
if they want to learn more about you and about your work?
You can find me, like, on LinkedIn for sure, and I also write longer-form content on Substack.
Fantastic. I'm going to have to subscribe to that as well, and definitely check out all the incredible stuff that Carly is doing, both with GG AI and Databricks. There's a ton of really cool AI
and gaming things coming down the pipe.
(47:50):
And we'll link everything, including Carly's LinkedIn and
all these other links in the show notes today.
So Carly, it's been a distinct pleasure.
Thank you so much for joining us.
Thank you. And to everyone who is
listening, if you love video games as much as we do, maybe
drop your favorite games in the comments.
If you're watching on YouTube, we'd love to hear from you.
Is there a game that got you excited about programming and
(48:11):
technology, or is this something where you're just kind of interested and you're following along? Are you a Stardew Valley person? Stellaris? That's me. Mario Kart? What are you playing? Are you on Warzone here with Carly? How are you relaxing?
Honestly, I may have to put French Bowser in. Like, I'm not above, like, sneaking in a French Bowser.
You'll find him with a little beret.
I mean, this feels... I will say, it's probably already a thing that
(48:35):
I can just go look up, but if not, I'm absolutely going to have to generate it from AI. So here's an AI use, guys.
And while you're leaving that comment, make sure to like, subscribe, and follow Carly on her Substack and/or her LinkedIn. Thank you so much for tuning in to Chain of Thought. We'll see you back here next week. And Carly, thank you again.
(48:56):
It's been great. Thank you.
Bye.