Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome to Mediascape: Insights from Digital Changemakers, a speaker series and podcast brought to you by USC Annenberg's Digital Media Management Program. Join us as we unlock the secrets to success in an increasingly digital world.
Hi everybody, and thank you very much for joining us this
(00:24):
week, as we are joined by a very dear friend and colleague, a friend of the program who started off as an industry advisor for Mediascape.
Today, I'd like to introduce Dr. Philip Garland, who is a
(00:46):
leading survey methodologist and data scientist. He's worked in multiple industries, but with a special focus on politics, culture and higher education, and he's been doing that in multiple spheres and in multiple ways for as long as I've known him, which is, at this point, about what? 25, 26 years.
(01:07):
So, Phil, thanks so much for being with us today.
Speaker 2 (01:11):
Pleasure to be here.
Thanks for having me.
Speaker 1 (01:13):
I want to jump right
into it and ask you about your
journey through education and that evolution that happened. It began with communication and then evolved in some pretty interesting ways, so could you take us through that to begin with, and then we'll start to talk about your career and then some pro tips that you have around surveys and data.
Speaker 2 (01:33):
Yeah.
So I like to describe my journey, my career journey, as kind of a happy accident, and the only reason that happy accident was possible is because I let myself be led by what I was good at and what I was passionate about. And so the very beginning of the story starts with a college freshman who had just moved to Seattle from Los Angeles, where
(01:56):
his high school was extremely diverse, with lots of multiculturalism, to a place that was more homogeneous.
So I came in thinking that I would follow in the footsteps of my father, my stepfather, who's an engineer for Boeing. My dad is an airline captain. So I thought aeronautical engineering, or maybe computer science, which at the time, in 1998, was just really kind of a
(02:20):
budding field compared to what it is today. And when I showed up on campus, I unfortunately experienced a great deal of racism, racial bias, prejudice, and it was in that moment, in those moments, that I decided I wanted to study that, I wanted to impact that, and so I switched to political science
and communication, because something about our kind of
(02:42):
mediated interactions was playing a role in how people thought of each other, how people thought of me. The most poignant illustration of this is that at almost every party I went to or social gathering, people would say, hey, Phil, nice to meet you, what sport do you play? And so then I knew it was a matter of exposure and education for those folks, and so I really wanted to study that and
(03:06):
impact that.
So that started the journey into political science. I was fortunate to have some incredible mentors, some professors that, really from the very first moment that we met, were interested in investing in this kid, and so I stuck to their sides.
(03:27):
I took every opportunity that they offered me. I hung around their office hours. I dropped into their offices when I was walking by and, you know, spent a few minutes sharing about what was happening in my life and what I wanted to do. And, as professors often are, they're receptive to anyone who has the gumption and the chutzpah to go
(03:49):
and do something like that. And they took me under their wings, and that made all the difference.
Especially at a large institution like the University of Washington, where there are tens of thousands of students, it can be hard to get that kind of one-on-one and personalized interaction. So I just signed up for their classes and all the rest that I could having to do with American politics, race politics,
(04:12):
political communication, mass media effects on society and so forth, and not only had a wonderful experience, I started getting better grades, and it was really just something that was fulfilling for me, meaning I was good at it, I enjoyed it and I was rewarded for it. And so that kind of positive feedback cycle is the kind of thing that can keep propelling someone to be
(04:36):
great.
So at the end of college I thought, you know, I wanted to get into politics, and so I applied to law school. I only applied to three schools. I didn't study for the LSAT as much as I should have, and I got into my third choice. And so I mentioned this to kind of the main advisor that I had, David Domke, in the spring of my last year of college, and I
(05:00):
had mentioned to him just in passing that my mom said, because I was 20 at the time and about to finish school, that I should apply to a master's program and, you know, kind of buy myself two years at a time, so that I wouldn't finish law school as a 23-year-old, which would probably be hard for someone to hire. Like, who wants to hire a 23-year-old?
(05:21):
He said, your mom's right, take the GRE. If you get above 550 a section, we'll give you a full ride for a master's. I got the scores, and then in the fall of 2001, 9/11 happened, and as a result we got three peer-reviewed academic publications concerning 9/11 out right away, and another book chapter. So at this time, now, I'm just turning 21 with four
(05:44):
publications and still intent on going to law school.
And so in the second year of my master's program, a political science professor named Adam Simon and I were in the elevator together, and at this point I had lectured in our largest lecture hall, like five, six hundred students; I had hosted a freshman interest group, kind of a mission project; I spoke in front of 3,000 freshmen
(06:06):
in the basketball arena. So folks had kind of heard about this young kid who had the publications, who happened to be African-American, and so he was asking, kind of in a leading sort of way, like, where are you going to get a PhD? And I said, I'm going to go to law school. And he's like, law school? You should get a PhD from Stanford. And that put the idea in my head. That was kind of like happy
(06:27):
accident number two, going to Stanford.
But you can really only understand the level of intelligence, but also the work ethic, that folks have at a place
(06:54):
like that, meaning their need for cognition and their effort and thirst for information, education and exposure is almost unquenchable. And so when you get exposed to something like that, it raises the bar for yourself.
And so at every university we all sort of teach the same, you know, seminal works from the same kind of pioneering
(07:15):
professors for the cornerstones of whatever field we're in, and so that's all kind of standardized for the most part. But what happens is your peers start exposing you to higher levels of achievement, output, production, effort, et cetera. And so that, I would say, was kind of the biggest impact on my
(07:36):
life: just, you know, the other grad students and even the undergrads that I was a TA for. They really kind of just showed me a new level of greatness.
So my chair, my advisor for the PhD, was Jon Krosnick. Jon Krosnick has received the Lifetime Achievement Award from the American Association for Public Opinion Research, and I
(07:57):
would say he's sort of on the Mount Rushmore of scholars in survey methodology. Now, I think sometimes when I tell people about my background in survey methods, they're kind of puzzled that someone could, that you would need a PhD at this level for something as simple and straightforward as asking questions. Right, we ask questions every day.
(08:17):
Hey, do you want to get some Mexican food for dinner? Like, that's a question. Our life is built around asking other folks questions, but there is absolutely a science to it in terms of question wording, the number of response options, sampling, and then the applied statistical analyses to interpret the data as well on
(08:46):
political attitudes and, within that, to some degree, kind of racial attitudes or intergroup attitudes, the attitudes that people hold toward one another on the basis of some characteristic, generally race being one of them, gender, kind of different characteristics. And so this was kind of the melding of my interest dating back to undergrad at the University of Washington, in terms of race and race politics
(09:07):
and racial attitudes in America. And so that really equipped me with a tangible skill set that led to, you know, a career in the survey industry ultimately. And so the trouble, though, is that this industry has existed separate and apart from kind of
(09:28):
academic scholarship for so long that it can be hard to convey the amount of information and technical detail in something like this when, again, people think question asking is something intuitive that we all do. So it's nice to be equipped with this skill set, but the challenge in the corporate world is really educating others in
(09:53):
the corporate world, and their clients, constituents and customers, that there are differences that make a difference in terms of how we ask questions and analyze the results and the responses to those questions. So definitely, you know, Professor Krosnick and the others on my committee really equipped me for that journey.
Speaker 1 (10:13):
All right, fantastic. And what was your dissertation about?
Speaker 2 (10:17):
My dissertation
studies asked people to evaluate a set of music lyrics, except that I told them that the artist who produced the lyrics was either black or white and was either a rap artist or a rock artist. So the prevailing scholarship and literature of the previous
(10:38):
50 years concerning, you know, racial attitudes and prejudice would predict that the black person and the black thing, rap, so that is the black rap artist, the lyrics would be perceived to be most offensive when the person was either black or rap or both. And so that was kind of the hypothesis going in, and what we found was almost the
(11:01):
exact opposite, which is that the white rap artist and the black rock artist drew the top two most offensive perceptions.
And so what that meant from a theoretical perspective is that when you cross over into a genre or a cultural domain that isn't
(11:23):
owned by your group, there's kind of a punishment effect. And so this really kind of flew in the face of, you know, decades of scholarship, and it was surprising to pioneers and titans in the field, who told me that this was just, you know, an exceptional discovery, essentially because it really
(11:43):
helped drill down and crystallize what the drivers are, what the mechanisms are for racism. So when you think about kind of racial strife and the civil rights movement, right, it ultimately meant that people perceived black folks as encroaching on white domains. Oh, you want to integrate schools? Schools are for white people.
(12:03):
You want to, you know, ride on this bus? Buses, you know, the front of the bus is for white people. So it starts with this conception, or this misconception, that some groups don't belong in certain places, and then we have this kind of primitive fight-or-flight aversion to things that are out of place. And so when you think about all of the advancements for women,
(12:27):
minorities and others, it's because they're trying to enter into a space where they previously were not.
Speaker 1 (12:35):
That gets into
ecosystem politics and a whole bunch of other things, along with what we've discussed in the past, that idea of, like, culture fit versus culture additive. But maybe we can circle back to that a little bit later. Very, very interesting, Phil. Before we jump into, you know, one of the early real successes in your career, I just want to ask about survey methodology,
(12:59):
and to you this might seem like a foolish question, because you've lived in this world, in this realm, for so long, but some others might be going, what's the significance of survey methodology? We can understand why surveys are, well, it's important to survey people and to survey large groups of people.
(13:20):
However, the concept of survey methodology: why is that so significant? Can we talk about what it is and why it's so significant?
Speaker 2 (13:42):
Yeah.
So the word methodology is key here, because methodology, as opposed to methods, means it's the science of the methods. That means we're experimenting with the methods to understand what differences are made by subtle changes in the methods. So one of my favorite examples of this was a study that Professor Krosnick and I did in graduate school with some other
(14:03):
folks, having to do with a simple question asking people how many movies they'd seen in the past month. Pretty straightforward, right? Maybe some big production studios or movie theaters, you can imagine who might be interested in a question like that. Netflix, right? Formerly Blockbuster. There are lots of people that might care about how many movies people are seeing in a month.
(14:24):
So half the sample got, how many movies have you seen in the last month? And the other half got, how many movies have you gone out to see? So you can see that, like, gone out to is a subset, which should be a lower number than seeing a movie, seeing one in passing, seeing part of a movie, right. And so it's true, gone out to did yield a lower number of movies than seen. So that was pretty basic. But within those kind of bifurcated halves, we asked seven different ways. During the last month, how many movies have you seen, if any? Which depresses the number, because it tells the respondent that it's okay to say none, and so forth.
And I think some of what they're getting at is this
(15:07):
concept where subtle changes in the language can produce statistically significant differences in the results. So another version is, you know, a lengthy, wordy version. Some people never see movies. Some people see movies all the time.
(15:28):
How about you? How many movies have you seen in the last month? Right, so you're kind of normalizing, again, that zero movies or very few movies is a perfectly acceptable answer. And so when you think about the multiple iterations for a question as simple as movie watching, and then apply it to something a lot more complex, where people have low information, low experience and low starting points for whatever
(15:51):
the topic is, those subtle differences make a huge difference in the outcomes and the results that we see. I have so many follow-on questions I want to ask, and one of them I'm just going to ask, even though I know we need to get into some of our other things, but it's the danger of binary thinking and living in this world of black and white, male, female, Republican, Democrat.
Speaker 1 (16:12):
You know, all of these
different things, and especially as we begin to think into the sophistication of survey methodology, and then survey responses and how to weave through all of that, in a world here, as we said, in 2025, that seems like, oddly, we're retreating back into a non-multidimensional way of thinking, back into these
(16:32):
binary ways of thinking. But I just want to throw that to you, because I know that that's something that matters a lot to you as it relates to surveys and collecting data: this idea of a binary approach to data collection and then data absorption, data research strategy, et cetera.
Speaker 2 (16:50):
Yeah, I mean, in terms
of binaries, there's actually a very durable finding in survey methodology called acquiescence response bias. So most people (and in certain cultures this is more drastic than in others) have a tendency to be polite and agreeable and respectful to, say, a researcher or a person in a lab coat or someone in a position of authority. And so if you ask
(17:12):
someone, during the last month, have you seen a movie, yes or no, you're going to get an inflation of yeses on its own, compared to a numerical transformation of that, where zero is no movies and anything above zero is a yes, right. If you were to kind of collapse the numbers into a binary, you would get different results when you give people the opportunity, or give them a clue,
(17:34):
that saying no is okay. So yes/no questions in the survey world are extremely problematic, because you're going to get an inflation of yeses that isn't valid and doesn't reflect the underlying phenomenon that you're trying to study. Interesting.
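The numerical transformation Phil describes, deriving an implied yes/no from a numeric count instead of asking yes/no directly, can be sketched in a few lines. The answer data below are invented purely for illustration:

```python
def implied_yes_rate(counts):
    """Share of respondents whose numeric answer implies a 'yes' (count > 0)."""
    return sum(1 for n in counts if n > 0) / len(counts)

# Hypothetical numeric answers to "During the last month, how many movies have you seen?"
numeric_answers = [0, 2, 1, 0, 3, 0, 1, 0, 0, 4]

# Collapsing the numbers into a binary: 5 of the 10 respondents imply "yes"
print(f"{implied_yes_rate(numeric_answers):.0%}")  # prints 50%
```

Per the acquiescence finding above, a direct yes/no version of the same question would typically come back with a higher share of yeses than this numeric route implies.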
Speaker 1 (17:47):
All right.
So anybody who has been alive and working within business in the last couple of decades has probably stumbled into SurveyMonkey a time or two, either designing a survey, or as a recipient of a survey, or a respondent to a survey. It's one of the most ubiquitous technologies that we have, and
(18:11):
you were heading up survey methodology at that organization, which is really impressive, and I remember when you got that job and I was floored. That's just so impressive. How did your work there influence the product and the business during your time at SurveyMonkey?
Speaker 2 (18:28):
Yeah, so when I
joined in 2009, the business was actually 10 years old at that point, but it was still pretty rudimentary in terms of its capability to handle sophisticated questionnaires that would be produced by academics and so forth from the world that I had come from. If you think about it in today's terms, it would be a
(18:50):
little more than Google Forms, right? Like, you can write a question, you can get some radio buttons or some response options, and that's about it. But a sophisticated questionnaire has lots of elements to it, like experimental manipulations, where someone's going to get version A of a question, like the movies question, or someone's going to get version B, and so it needs to randomly assign people to the version.
(19:12):
You need to have skip logic. It's like, you know, what is your gender or gender identification? If your gender or gender identification is male, we're going to skip the questions about pregnancy, right? And so that way we're not asking you too long of a questionnaire that doesn't apply to you. So you need skip logic. You need sometimes to change the order of the response
(19:32):
options. There is, in a written questionnaire, what's called a primacy effect, meaning people read top to bottom, or left to right, and kind of stop when they find a satisfactory answer. That phenomenon of not trying too hard, or giving a BS answer, is called satisficing, and so there are elements that you can insert into the questionnaire design to mitigate satisficing,
(19:55):
essentially.
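The three elements Phil lists (random assignment to a question version, skip logic, and response-order rotation to blunt primacy effects) can be sketched roughly as follows. All question text, field names, and options here are hypothetical, not SurveyMonkey's actual implementation:

```python
import random

# Hypothetical wordings for two experimental conditions
MOVIE_WORDINGS = {
    "A": "During the last month, how many movies have you seen?",
    "B": "During the last month, how many movies have you gone out to see?",
}
SCALE = ["Strongly agree", "Agree", "Neutral", "Disagree", "Strongly disagree"]

def build_questionnaire(respondent, rng=None):
    """Assemble a per-respondent questionnaire: assignment, skip logic, rotation."""
    rng = rng or random.Random()
    # Experimental manipulation: randomly assign this respondent a version
    version = rng.choice(sorted(MOVIE_WORDINGS))
    questions = [MOVIE_WORDINGS[version]]
    # Skip logic: omit questions that cannot apply to this respondent
    if respondent.get("gender") != "male":
        questions.append("Are you currently pregnant?")
    # Rotate response order across respondents to mitigate primacy effects
    options = SCALE[:]
    rng.shuffle(options)
    return version, questions, options
```

For example, `build_questionnaire({"gender": "male"})` yields a one-question path, while any other respondent also receives the follow-up question.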
So, step number one for SurveyMonkey at the time I joined, which again was 10 years old but about 30 people, was to create a tool that everyone could use: your DIY parent group, your dentist, but also a very well-trained survey researcher. So that way we essentially expanded the addressable market
(20:19):
from, you know, kind of novices to experts. So step one was to build out the tool, because everything downstream that we wanted to do with the business depended on kind of drawing in a more sophisticated customer base.
The second objective was to create what's called an access panel, or a double opt-in access panel, which is when people
(20:41):
voluntarily sign up to take surveys for some sort of incentive or reward. Could be cash, could be a gift card, could be points to buy an object, could be sweepstakes, so many things. And this was a fairly mature market at the time, although it was undergoing some consolidation, because it
(21:04):
happens to be a market that has the structure of a monopsony, meaning many sellers, one buyer, and so that creates a commoditized business where everyone's trying to sell their survey responses for the cheapest price. And so the margins are really tough, which means that any element of kind
(21:25):
of inefficiency is very costly to the providers or producers of online survey interviews. And so, for example, if a bunch of people quit your survey, right, then are you paying incentives to those folks who didn't finish?
If a bunch of people are satisficing, straight-lining, typing abracadabra, right, mucking up the data, creating a bunch of
(21:47):
noise, then you have to replace those, right? So the art and the science in the kind of panel industry is to get the largest and most diverse pool possible and to have them take the surveys regularly and take them seriously, which is a lot harder than it sounds.
So SurveyMonkey wanted to enter into this business because its
(22:10):
DIY survey customers were maybe wanting to test a new logo or a color or something, right, so we have this experimental manipulation, or understand the landscape of a new business venture, or do competitive analysis. Tons of reasons why its existing survey creators would need a pool of respondents to go with that, to supplement that
(22:32):
survey work, without having to go stand on the street corner with a clipboard, and at the time there was really no social media to post on. So it's like, where do I get people, and preferably a representative group of people, right, so that I know what all Americans think, or all Californians think, or whatever the certain group is? So it has to be generalizable, has to be valid, has to be
(22:52):
reliable. And so we set out to build this business, and we did. Fortunately, we were doing around 2 million survey respondents a day at that time, and so we just, at the end of their, again, their PTA, their dentist, their optometrist survey, asked them if they wanted to stick around or sign up to take more surveys, and a substantial number of people did.
(23:14):
We were able to build the largest pool of people in the industry in about six months, and it became a very significant portion of our revenue, because we had this existing base of survey creators.
But the third piece related to the second, which is proving that the pool of respondents, or available respondents, that you have is representative of America, essentially, in this
(23:37):
context. And so, fortunately, there is what I like to call the Survey Olympics every four years, and it's called an election, which means there's a right or wrong answer, and you can kind of submit your horse into the race to say, you know, how representative is this panel? And to the extent that it can produce valid predictions of an election, that suggests that you
(24:01):
have a very healthy, hygienic, suitable and usable panel before you. So in the 2012 presidential election we did just that, but we did it with a slight twist, because election polling is very expensive.
You'll see the sample sizes for these election studies generally around 1,500, sometimes 750. And that's because you're going to get a very predictable
(24:25):
margin of error at a predictable confidence level. At about 1,500, there are diminishing returns to collecting more people, which is really expensive, for very small gains in the reduction of the margin of error.
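The diminishing returns Phil describes fall straight out of the standard margin-of-error formula for a proportion, z * sqrt(p(1-p)/n). A quick sketch at the conventional 95% confidence level (z of about 1.96) and worst-case p = 0.5:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (750, 1500, 3000, 6000):
    print(f"n={n}: ±{100 * margin_of_error(n):.1f} points")
```

Going from 750 to 1,500 respondents tightens the margin from about ±3.6 to ±2.5 points, but doubling again to 3,000 buys back only about 0.7 of a point, which is why election polls so often stop around 1,500.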
So our advantage was, at 2 million surveys a day, we were touching about 90 million households, which was just shy of half at that point, every 90 days on SurveyMonkey, again,
(24:49):
because we're DIY for churches and dentists and all the things. So everyone had some place where someone was kind of doing a DIY survey.
So we had great coverage, which is the term that we use for understanding the ability for people to make it into a sample. Traditionally it had been first face-to-face and then telephone,
(25:11):
and now we were in a place where kind of internet access was virtually ubiquitous, you know, 90, 95% internet penetration in the United States at that time. And so the argument about coverage is, hey, we think we have coverage that is, you know, similar to that of a phone or a face-to-face opportunity. There's a ton of debate about this, which I won't go down into
(25:34):
the weeds on, but ultimately the idea is we had great coverage.
So the next question is, what are our response rates? In 2012, the response rates for telephone surveys were about 9%, which was really 14, 15% for landline telephones, if anyone remembers what those were, and about 5% for mobile
(25:55):
phones, because people don't like to answer their phones and use what were then expensive cell phone minutes. And also, the law mandated that you could not use an auto-dialer for cell phones, meaning a human had to dial the numbers, and therefore it was more expensive. And so it was really expensive to do election polling, and it's almost impossible to do it at
(26:17):
the state level repeatedly.
So you'll see things like, you know, the Des Moines Register. The Des Moines Register does, you know, polling for Iowa, right, and it does it infrequently. But Iowa is a very critical state in the primary season, right. And so SurveyMonkey had the opportunity to do all 50 states very cheaply, because we had tens of thousands, if not
(26:38):
hundreds of thousands, of people in each state.
So we did about a million, two million survey responses in the 12 weeks leading up to the presidential election. We didn't do too much in places like California, where the outcome was obviously predictable, but in the 10, 11 battleground states where elections are won and lost,
(26:59):
we were doing surveys daily and producing the results weekly leading up to the election, and we got 48 states right.
margin of error, which I thinkcritics of online polling will
love to jump on.
But all pollsters have dozens,if not a hundred, kind of models
(27:21):
for weighting the data beforethem and they ultimately the art
meets the science, where youhave to pick one of those and
you know, say, that's yoursubmission into the survey
Olympics.
So we certainly had modelswhere those two states were
correct, but maybe it flippedsome other ones incorrectly,
because you can't really cherrypick.
You want to apply yourweighting scheme evenly and so
ultimately we're very proud ofthat.
(27:42):
We were featured in the NewYork Times because of this and
really I think kind of set thestage for what became the
legitimacy of online pollingthereafter, which is very
commonplace these days.
Speaker 1 (27:55):
All right, fantastic.
Pretty amazing that you were involved in something that was groundbreaking like that, and congratulations. We think a lot about how, especially within higher ed, so much of what we teach and what we share (and this will dovetail into our discussion about AI a little bit later) is actually a synthesis of what has been shared with us, and when there's
(28:18):
the opportunity to create a specialization, such that you're able to make a contribution to a body of knowledge that's new and not just a synthesis of what's come before, that's remarkable. Switching gears into a little bit more of what it is that you do now, there are three questions I want to ask you to get us
(28:39):
started, and so we'll take them one by one. The first one is, could you share with us a pro tip for surveys, and also the biggest mistakes that people sometimes make?
Speaker 2 (28:51):
I'll say one and the
same here, and this may ruffle some feathers in some circles, but the ubiquitous and very commonly used Net Promoter Score, or NPS, is not worth the paper it's printed on. It will lead organizations to wrong and wrong-headed decisions.
(29:12):
If you'd like, I could show you a diagram of this, but ultimately I can show you three different distributions of data that would lead you to interpret very different business conclusions, business strategies, out of them. One kind of looks like the Grand Canyon, where you've got, like, data on the very ends, kind of like a polarity on the ends, and nothing in the middle.
(29:33):
You can have some with, like, a skew to the left, where folks are, you know, kind of clumped on the higher end. And you can have some that are shifted toward the middle a little bit, but still kind of on a normal curve. All three of those distributions will give you the same NPS score. So how can we make a decision as a business when extreme
(29:57):
levels of variance and dispersion yield the same number?
And that's where I say, just use the tried and true mean, median, standard deviation and frequencies of your distribution to make sensible decisions. We don't need a magic bullet, this one question that's somehow going to solve all of
(30:18):
our problems. From a question-wording perspective, it's double-barreled, because it asks if you'd recommend to a friend or a colleague. There are certainly products that I would recommend to my friends and not my colleagues. I won't go into what those sensitive things might be. Could be athlete's foot, I don't know. But the point is, friends and colleagues are different groups, and they should be separated. It's on an 11-point scale; we suggest five- or seven-point scales that are fully labeled.
(30:41):
There aren't labels on all the points of those scales. So there's a paper by Professor Krosnick, Daniel Schneider and some others that goes into this. You can easily find it on the web if you search for Krosnick net promoter score. There's a lengthy paper that explains why it's problematic. But the summary here is, you can get the same number from very different data, and that's not the point of data.
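The "same number from very different data" point is easy to reproduce. Using the standard NPS formula (percent of 9-10 "promoters" minus percent of 0-6 "detractors" on the 0-10 scale), the three distributions below are invented for illustration in the spirit of Phil's three shapes, yet all score the same NPS while their means and spreads diverge:

```python
from statistics import mean, stdev

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Three hypothetical 100-response distributions
grand_canyon = [0] * 30 + [7] * 20 + [10] * 50            # polarized ends, empty middle
clumped_high = [6] * 20 + [8] * 40 + [9] * 40             # skewed toward the high end
near_normal  = [6] * 25 + [7] * 15 + [8] * 15 + [9] * 45  # shifted toward the middle

for name in ("grand_canyon", "clumped_high", "near_normal"):
    dist = eval(name)
    print(f"{name}: NPS={nps(dist)}, mean={mean(dist):.1f}, sd={stdev(dist):.1f}")
```

All three print NPS = 20, even though mean satisfaction runs from 6.4 to 8.0; the mean, median, and standard deviation recommended above would tell these stories apart immediately.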
Speaker 1 (31:04):
All right.
Second question AI.
It's on everybody's minds thesedays.
How is AI going to impact thesurvey industry?
Speaker 2 (31:12):
Good question For
some menial tasks much of data
work and survey methodology work, before you get to the big aha
and the finding and everythingthat you're looking for.
There's a ton of data cleaningright.
There are outliers, there aremistakes in data entry, there
are satisficers.
There's a ton of value in whatAI can do to clean data and get
(31:34):
it prepared, coded, standardizedand in a position to be
analyzed.
That takes probably 80% of thetime in terms of a data process.
So it's definitely going tospeed that part of the
methodology up and really removesome of the tedious work
(31:55):
involved there.
I think the second part is the analysis. There have been statistical software packages dating back. Most people, I think, in this era remember SPSS. There's R, there's Stata, and most recently there's Python, and so these are all kind of scripting languages to basically
(32:15):
undertake the statistical test, the regression, right, whatever the applied stats method at hand is. That used to require a lot of code, where now you can literally just put the desired test into ChatGPT or any other kind of program, and it can do that for you. I just tried it the other day. I put in my dissertation data and gave it some very minimal
(32:38):
commands, and it was very good at doing that. That data were obviously already cleaned and ready to go. But I think, from the cleaning part and the analysis part, AI is going to be instrumental in saving everybody a bunch of time and headache and hassle with the parts of the work that, you know, aren't as interesting.
Where I think AI cannot have an impact is if you want to survey
(33:00):
1,500 Americans about the election. No amount of data simulation can do that for you. You literally still have to go ask certain questions of individual human beings to get their opinions. So that part, I think, is safe from AI. The asking real people real questions part I don't think can be simulated, so it's insulated to some degree, but it
(33:24):
will be sped up and made more efficient in the interim.
Speaker 1 (33:27):
And as we think about
politics, and you're teaching in our digital media program, and we think about new media, all of the new forms and platforms of new media: how is new media affecting politics within this world of surveys and data?
Speaker 2 (33:43):
Yeah, I would say far and away the biggest factor that is affecting our politics is the bifurcation and the isolation and the tailoring of media.
Your feed, your friends' feeds, even just take Fox News to CNN
(34:03):
or CNBC, whatever you like, this kind of me-centered media consumption, which was an idea pioneered 30 years ago by Cass Sunstein at Harvard.
And it's really when you only seek out or are served information that corroborates and reinforces your opinion, it is very hard to undertake politics, which itself
(34:28):
is a competition for scarce resources.
But if the information about that scarcity or those resources is completely different, there's no way to have a conversation about it.
So that's why you've seen more polarization, more tribalism, more separation. Siloing. Siloing, exactly, because this is
(34:50):
the structure of our media, of our media landscape.
You know, back when it was ABC, CBS, NBC, Dan Rather, Tom Brokaw and Peter Jennings, they were largely working from the same set of guardrails, the same education, the same institutions, the same kind of historical perspective.
But now the 15-second TikTok about the latest conspiracy
(35:16):
theory is filling that void, or has taken the place of those kind of standardized options.
And when those aren't vetted, when those aren't questioned, and unfortunately our brains have not really evolved as fast as the technology, and so we're very poor at discerning fact from opinion, truth from falsehood, and so
(35:36):
forth. It's compounding the effect on a legacy, primitive brain with a technology that, under Moore's law, is doubling every couple of years. And I wish I had a brilliant solution to kind of get us all back on
(35:57):
the same page.
But that's the biggest effect on our politics, which is creating more tribalism, more separation, more angst, more polarization, more anger, more hurt, more sorrow.
Speaker 1 (36:11):
Just to comment on that.
I have a filmmaking and photography background, and we talk about a frame.
You know, the frame of the photo that you pick, right? Could be this big.
And then there's that expression: what's your frame of reference?
And people think about, oh well, that's just the context I'm coming from, it's the perspective that I have, right? But if you think about what it really means, a frame means that
(36:31):
you're able to see this much, and when a photographer or cinematographer points their camera at something, they pick what's in that frame.
Mostly, what happens is what you leave out.
So, when you talk about your frame of reference, and a media perspective in a frame of reference, even if we're talking about algorithms, algorithms are created by people who have
(36:53):
explicit and implicit agendas and bias, often, or almost always, driven in some way by money and resources and stock prices and just trying to hold on to their jobs.
And if you think about a frame of reference, it's really important, and scary, when you think about how what is being left out is far more than what is being put
(37:15):
inside of that frame.
So, for anybody who's listening, and for all of our students at DMM, we talk about that a whole lot, and of course here at the Annenberg School for Communication and Journalism, you know.
I want to ask you just a couple more things, Phil, and one of them has to do with your background as an entrepreneur.
You've been a founder of multiple companies.
You have been on the founding teams of multiple companies, or
(37:39):
consulting with multiple startup companies, and I want to ask you, you know, so many entrepreneurs, when they are thinking about starting a new company and developing a new product, you know, it comes from this idea of gut and instinct.
You know: I know that I would use this if it was around.
Here's this problem that I've seen within my world and sphere
(38:00):
of experience and influence, and it's very much like, I would use it, or it's very emotional.
You ask a few friends.
But your approach is much less qualitative, much more quantitative.
So can you talk about the quantitative side of being an entrepreneur, and how that has influenced the development of products?
Or maybe even when you try to decide whether or not to hit go
(38:23):
on an idea that you think has some merit, and are you going to put a whole lot of time into it and try to build a company around that?
Qualitative versus quantitative, in your opinion and in your background as an entrepreneur.
Speaker 2 (38:34):
I think step one, a venerable VC once told me, is understanding what friction you're trying to solve.
And sometimes, when you have a gut instinct about friction, there's some friction in your life.
eBay was famously started by someone whose wife wanted to be able to sell, to trade, PEZ dispensers, right? So there's friction in, like, getting these
(38:54):
collector's items out to the market, right? So that starts with a personalized friction.
And that's where kind of surveymethodology and these double
(39:14):
access panels can come in and behelpful because you can really
get a sense of is this a problemthat me and a few other people
have, or is this a problem thateveryone has?
Is this a problem that we haveevery day or is this a problem
we have once a year?
Right, and so when you start toanswer those questions, then
you can understand what yourprice should be, you can
understand what your productshould be and then you can
understand what your productmarket fit is, which is you know
(39:35):
how well your product addressesthat friction.
And I think it's a veryunderutilized step in the
entrepreneurial process wherepeople could basically collect a
survey from 1,500 people andunderstand their market or their
desired market At the same time.
(39:56):
you know, Henry Ford's famous quote of, if I had asked my customers what they wanted, they would have said faster horses.
So it's this pull between, you know, asking people about something that they don't want, right?
I mean, we all had flip phones when smartphones came out, and Steve Jobs himself famously didn't want to do the iPhone, right?
(40:16):
So it's always going to be an art and a science to some degree.
But you can help yourself a great deal by at least understanding how many people you're going to have to educate about this new technology that you think they need, right?
So either way, you're going to learn something about how to educate them or how to make the product so that it fits within their friction and so forth.
But ultimately, survey methodology and asking a, you
(40:40):
know, target audience of people who could be your customers, or just the general population, is going to get you a long way in understanding the friction, the amount of friction, and how you're going to solve it.
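[Editor's note: the 1,500-person sample size mentioned in this conversation is not arbitrary. Under the standard formula for a simple random sample, assuming the worst-case proportion of 50%, it yields roughly a ±2.5 percentage point margin of error at 95% confidence. A quick sketch:]

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random sample.

    n: sample size; p: assumed proportion (0.5 is the worst case);
    z: critical value (1.96 for 95% confidence).
    """
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1500)  # about 0.025, i.e. roughly +/- 2.5 points
```

Real surveys with quota sampling or weighting have somewhat larger effective margins, but this back-of-envelope calculation is why 1,000-1,500 respondents is the conventional size for a national poll.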
Speaker 1 (40:52):
It reminds me of something that I learned about through the lean startup method, which was pioneered by Harvard Business School, and one of the big questions was, you know, as an entrepreneur, you need to ask yourself, first and foremost, not can we build this product, but should we build this product?
Not can we bring this to market, but should we bring it to market?
(41:13):
Is there a product market fit, before we go and spend a whole bunch of time, money, resources, et cetera, on something that probably shouldn't be developed in the first place, even if it feels like a wonderful idea.
Okay, last question, the question I always love to ask when we have our wonderful and esteemed guests, and it's your opportunity to share one piece of advice, and it
(41:36):
could be something about life, it could be something about professionalism or academia. But if you had one piece of advice or a core value or a North Star that guides you that we could gain some enlightenment from, I'd be really grateful to hear.
Speaker 2 (41:51):
Yeah, I think, dating back to my happy, accidental journey of a career, what I learned and try to impart for folks is: whatever level of effort you think you're giving, your max effort, there's a gear and a level past that, for yourself and certainly in the world.
Meaning someone out there is doing your level and then some,
(42:13):
and maybe not even breaking a sweat, because they've already pushed past your highest level and are trying to push past their own highest level, which is creating an even bigger gap than you think.
So it's hard to fathom that there is another level when you're exhausted, when you're busy, when you have a ton of things going on in your life,
(42:34):
but it's not hard for someone else to fathom, and those are the folks that you're competing against.
Michael Jordan used to say he practiced so hard and so forth because he knew Larry Bird was getting up extra shots. More in modern times, I've heard Steph Curry needs to make 250 swishes of threes after practice.
So, already tired from practice, already exhausted, ready to go
(42:56):
home.
Another level past that, and it shows in the output on the court. And so life is kind of like that: you will get out of it what you put into it.
But what you think you're putting into it probably isn't the most on earth, because you've never seen the most on earth.
So you kind of have to imagine it.
You have this boogeyman, this fictional character out there
(43:16):
that's getting 300 swishes after practice.
That person probably exists, and if they don't, you'll be that person when you hit it.
Speaker 1 (43:31):
So push yourself harder than you thought was ever possible and you have a chance at greatness.
Phil, I want to thank you for sharing these insights as you've gone through your journey.
I like to think of middle age not as over the hill, but as in our primes, and I say 'are' because we're both right in our mid-forties now. And so thank you for sharing where you are in your prime and the prime of your career. Definitely have some scars and battle wounds, learned a lot, ups and downs, but your
(43:53):
trajectory, I'm real proud to say that I've been able to watch that over the years. And there is a piece of advice, right, and it's what you started your talk with today: just the importance of the people that are in your life, and how people came along, cared about you, invested in you, mentored you. And now it's fantastic to see
(44:16):
you doing the same thing for so many others.
So I just, I want to thank you for that. And for anybody who's listening, I put that charge on you as well, especially for our students: to think about how you can make contributions to the people around you, some of whom you might not even know, because when you impact their lives, you can have a multiplier effect out into the world.
So thank you, Dr Garland, for joining us today, and I look
(44:38):
forward to spending a lot more time over the summer as we get into some of these new classes that you'll be teaching.
And I want to thank everybody for joining us on this week's episode of Mediascape.
And, Dr Garland, I hope you'll come and visit with us again soon.
To learn more about the Master of Science in Digital Media Management program, visit us on the web at dmm.usc.edu.