
February 22, 2023 | 21 mins

In this episode of our Expert Insights Series, hosted by BI consultant Greg Brown, we feature Brian T. O'Neill, Founder and Principal of Designing for Analytics. Brian shares his deep experience in creating value from data projects and the impact of human-centered design on business outcomes.

Designing for Analytics website: https://designingforanalytics.com/
Experiencing Data podcast: https://designingforanalytics.com/experiencing-data-podcast/

Click here to watch this episode on our YouTube channel.

Blue Margin helps private-equity-owned and mid-market companies organize their data into dashboards to execute on strategy and create a culture of accountability. We call it The Dashboard Effect, the title of our book and podcast.

Visit Blue Margin's library of additional BI resources here.

For a free, downloadable copy of our book, The Dashboard Effect, click here, or buy a hardcopy or Kindle version on Amazon.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Greg Brown (00:04):
Welcome to our Blue Margin Expert Insights Series.
We're glad that you joined us today. This series is for private equity and mid-market executives who want to use data and dashboards as a short path to increasing growth and profitability. I'm Greg Brown, BI consultant at Blue Margin, and today I'm hosting Brian O'Neill, Founder and Principal of Designing for Analytics, a consultancy that helps data product leaders increase adoption of machine learning,

(00:26):
AI, and analytic solutions through human-centered design. Brian has been a product designer and UX consultant for over 25 years, partnering with companies like Dell, EMC, TripAdvisor, Fidelity, NetApp, Roche, and Advi. Brian has spoken internationally at multiple O'Reilly conferences, the International Institute for Analytics Symposium, Predictive Analytics World, and Boston College. Brian also hosts the

(00:49):
very popular five-star podcast, Experiencing Data, where he speaks with leaders at the intersection of design, machine learning and AI, analytics, and data product management. Be sure to check that out. And of course, we'll link Brian's podcast in our show notes. Brian, I'm excited to have you here today to discuss how your world of designing for analytics intersects with real business results, and hopefully to pick your brain a bit about the role of strong data product design for mid-market companies.

(01:11):
Welcome to the show.

Brian O'Neill (01:14):
Hi, it's great to be here.

Greg Brown (01:15):
Yeah. Awesome. And Brian, just to start us off, tell us a little bit about yourself, your background, your value proposition, and who you typically partner with.

Brian O'Neill (01:24):
Sure, sure. Well, it's great to be here. Thanks again for inviting me on. My formal background is actually in music, we don't have to go into all that, but I do have another career as a professional percussionist and drummer. So I tour and perform with orchestras and all that kind of stuff. But along the way, in college, when I was focusing on my music studies, I was learning about the web. And so there was

(01:47):
actually a bassoon major living next door to me in my dorm, and he was always tinkering around with telnet, and browsers were just coming out in that whole era. And long story short, he kind of got me interested in that, and by the time I graduated with my music degree, I had been working as a web designer, mostly doing marketing websites for the northern Arizona

(02:07):
tourism industry there, and had really cut my teeth as a designer and sort of developer. Back then you kind of did everything. So fast forward, you know, moved to Boston later on and went more into the world of startups and software, and over time, really ended up focusing on B2B and enterprise. And then even more specifically, down

(02:27):
into the analytics and data world, largely through work on IT software, so software to manage data centers. That and financial services. So my time at Fidelity and JP Morgan and E-Trade, a lot of trading systems, things like this. So that's all data, quant, that kind of stuff, data

(02:48):
visualization. That's really where I kind of cut my teeth with this. And then when I went full time with my consulting work, and got out of the W2 employment world, I decided to focus in this area, because I was seeing where there was an opportunity for human-centered design and product management principles to be applied to this world of data science and analytics, particularly in the non-tech company, the non-

(03:10):
software industry area. I wasn't even aware how much analytics was an internal practice at organizations, building machine learning and analytics solutions and dashboards for internal use, not to sell as a product necessarily. So my two audiences are still both the commercial software industry, mostly in the B2B data product space, but also this internal side, where it's

(03:33):
usually more training, in terms of the work that I do now with clients. When I work with, like, an internal BI team or a data science organization, their challenges are usually in the low-adoption issue. Like, we have the technical capability, we know what we're doing there, we give our stakeholders what they ask for, and then they still don't use it. Like they

(03:54):
don't trust it, or they don't want it, or they say it's wrong, or we find out what the requirements are after we give them what they asked for. And it's this kind of data tennis game, which is like, "Well, whose job is it to define what the problem is and what the need is?" And then you have the data scientists complaining, like, "Well, what's your business problem that you need help with?" And the stakeholders

(04:15):
saying, "Well, we want to usemachine learning. So what's
possible? What can we do withAI? Because that's what everyone
else is doing." And then they'relike, "Well, what are you trying
to solve?" "Well, what'spossible?" and I call this like
the data tennis game. And thisfunction of human centered
design and product management,which is really about falling in
love with the problem space,this is missing, and most of the
time, presenting problems arewhat teams are working on. I

(04:40):
forget who came up with this term, I love it, but this is the idea that, like, "I've got a rash on my arm, so I need some skin cream, doctor." And it's like, "Well, maybe you need to change your diet." So if you're just in the business of applying skin cream every time someone says they have a rash, because that's what they asked for, this is where problems can set in.

(05:01):
Because that's a presenting problem, there may be an unarticulated problem under the surface that we need to get to. And most of the time, that's true. And this is where I think things tend to fall apart, as we haven't actually defined what the real need and the problems are. We're just taking the surface request, and we're addressing that through a Jira ticket or whatever. And we're assuming that the person with

(05:24):
the need knows how to specify exactly what it is that they need. And that's actually a team effort, and a whole skill set that I think a lot of data product and data science leaders struggle with: "How do I get those unarticulated needs out?" And, you know, design is the process of doing that. Design and product give us tools for doing

(05:44):
that. But that's where things start to go wrong with a lot of these teams when they're struggling, in my opinion. So that's kind of the monster that I'm trying to attack with all of my work, and help companies with: this low adoption. As I always tell my audience, low use is bad for you, the maker, it's bad for the recipient, it's just bad for

(06:05):
everybody, because someone just spent money, they didn't get what they wanted, your tech didn't get used. So if you're a data scientist, you spend all this time building a model that's gonna go on a shelf, maybe you can write a paper about it, but you didn't get any usage out of it. Over time that's just bad for everybody. So using these different tools is actually something that's

(06:27):
good for everybody. You're becoming focused on outcomes instead of outputs. When someone asks for a dashboard, they don't really want a dashboard. And my LinkedIn says, "Nobody wants your machine learning and analytics." They don't. They want something that's downstream from that, and we need to focus on what the downstream thing is that people want. And that's a collection of both. There could

(06:49):
be financial and quantitative metrics that go with defining the value of the solution, of the data product you're building. But there's also what we call user experience outcomes. This is where user experience design comes in, to actually be thinking about workflows, or sometimes you hear "jobs to be done," or tasks. These are all the ways that designers tend to look at things differently than I think a BI developer does, who's

(07:12):
really focused on just the dashboard. I'm always trying to teach my audience, you need to think in terms of workflows. Workflows happen over time, they may touch multiple different tool sets, and we need to be thinking about end-to-end workflows and designing around that. And the only way you can do that is to know what it's like to be the person using it, which requires you to spend time, not building and coding

(07:35):
stuff, not working with datasets, but actually going and observing, and mostly listening to and observing the people who are going to use it. And we need to work backwards from their objectives into something that's useful, usable, valuable, trustworthy. That's the human aspect here, right? The "non-data-first way." The "human-first way" is we work

(07:57):
backwards from the need, and then we assemble technology that goes with the need. We don't start with a data set, and then kind of putz around with it and hope that we build something right. And then we accumulate technical debt, and then we have sunk-cost bias, and then no one wants to throw away this thing they spent a million dollars on. I'm sure you know that whole thing. So anyhow, that was a

(08:19):
long-winded approach. I kind of went into a bunch of things there.

Greg Brown (08:23):
No, absolutely. I love the analogy. I mean, a lot stuck out to me in that response, but I love the analogy of the data game of tennis, so to speak, where you're taking a limited input and you're volleying something back over the net, instead of walking over to that side of the court, really understanding the problem, unpacking it. Instead of just volleying back little solutions or responses based on

(08:43):
limited input, without digging a little bit deeper and really asking the questions that you need to. The other thing that stuck out to me, and it's a common analogy in different realms, is the idea of patients diagnosing themselves. You know, we wouldn't walk into a doctor's office and have the doctor accept our diagnosis of the problem and then say, "Okay, well, based on what you told me, here's what I prescribe to you." They may accept some of our inputs and say, "Well, that

(09:04):
gives me a starting point." But the doctor is always gonna say, "Let me go through my process and ask my questions, and really get a conversation going about what this issue is, before I actually make a judgement and a prescription to you." I think that's important, too, to understand that you don't want to accept users, or a business team, self-diagnosing, and only accept that. You can use it as a starting point. But you have to

(09:25):
dig a little bit deeper and really understand the need and what they're looking for. So no, that was great. A lot of stuff stuck out to me there, I could respond to almost every part of it. But those were the two things that really stuck out to me the most. Have you ever seen a team shift their focus? I mean, maybe they were initially very founded on the outcome, and really using that to gear and focus their design. But then as

(09:46):
people start using it, they start to shift more towards the output, and they're focusing on the wrong things. They're not continuing to keep the business outcome as the guiding principle as they evolve the product. Have you ever seen that sort of drift happen with data products?

Brian O'Neill (09:59):
I think that's super natural to do. Like, eventually it's like, "Okay, we know what the need is, and now it's execution mode." Now it's the 90% of the project, which is building the stuff, right? And again, I think, when there's no sense of, "How would we know if we did a good job?" and being able to test that... and this is, again, something you do not want

(10:21):
to wait until the end. Whether you're a consultant doing an engagement, or you're an internal data science leader, you don't want to find out, "How does one keep score in this game we are playing?" You don't want to find that out at the end of the game. It's much more fun to find it at the beginning of the game. And here's the thing: the stakeholders, the CIOs, the business stakeholders, the private equity people, they may

(10:42):
not know what all the rules of the game are yet. They haven't thought about it that way before. So this requires people to work together on what it is. Like, "You want to reduce attrition, okay, what's a customer?" "Well, what do you mean, 'what's a customer?'" "Well, we actually have 15 different statuses for customers. We have people that left and came back, we have current ones, we have people

(11:04):
that paused their accounts, but they haven't left. We have people on $5-a-month plans..." Like, "Oh, I didn't know that." "Great. Let's come up with a definition of what a customer is." You know, all this stuff. But the point is, the parameters for the success of the project need to be defined upfront, in clear ways, in clear language that everyone can understand.
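Brian's "what's a customer?" exchange is exactly the kind of definition that benefits from being written down once and shared. As a minimal sketch of that idea (the statuses, fields, and rules below are hypothetical, not from the episode), a team might codify the agreed definition in one function that every report and model reuses:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical account record; a real system would have far more fields
# (and, per Brian's example, fifteen-plus status values to reconcile).
@dataclass
class Account:
    status: str           # e.g. "current", "paused", "churned", "returned"
    monthly_spend: float  # the $5/month plans from the example
    last_active: date

def is_customer(acct: Account, as_of: date) -> bool:
    """The single, agreed-upon definition of a 'customer'.

    Illustrative rules: paused accounts still count, churned ones
    don't, and anyone inactive for over a year ages out.
    """
    if acct.status == "churned":
        return False
    if (as_of - acct.last_active) > timedelta(days=365):
        return False
    return acct.status in {"current", "paused", "returned"}

# Every dashboard and model calls this one function instead of
# re-deriving its own notion of "customer" from raw status codes.
accounts = [
    Account("paused", 5.00, date(2023, 1, 10)),
    Account("churned", 0.00, date(2021, 6, 1)),
]
print(sum(is_customer(a, date(2023, 2, 22)) for a in accounts))  # -> 1
```

The point is less the code than the practice: the definition lives in one place, in terms the business agreed to, instead of being re-derived differently in every dashboard.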

(11:25):
And they usually don't include words like "machine learning," or "we used Bayesian whatever," or "we use this method," or "Azure." Those words generally do not belong in those things. If you're there, then you're probably back to talking about technical outputs again. So the drift is normal, but I think if you're having regular check-ins, and you know how you're going to

(11:45):
measure along the way, then you don't go way off into left field, spend all this time and money, and find out just how far off things are. Modern product has a propensity for wanting to ship faster and get feedback, as opposed to spending a ton of time building a giant thing and hoping that it's right, and then shipping it, and then finding out. We want to

(12:07):
increase those learning loops, right, getting those learning feedback cycles. So, "Oh, that's not what you meant, you meant this." But that's different than what often passes for an iterative approach, which is like, "Well, right now we have this metric, so we put that one out. And next month, we'll have the pipeline for this, and then we'll add that." That's more incremental development. That's not iteration. That's incrementing, right? We're just

(12:29):
adding more to the pile all the time. This is different than what you might have heard of as Agile software development, which is really about, "Is the design right?" And yes, there may be features or data points that will take more time to be added. But if you're just constantly adding, and you're not actually iterating over what's there, and you're not willing to spend the time to say, "This first pass was wrong,

(12:53):
we're not going to continue building on top of that, because it's conceptually wrong," then you're not really iterating. It means accepting that we can't predict what the perfect version is going to be. You can't pre-plan all that stuff all the time. This is something that teams have to swallow, right? And also, the business needs to swallow this too. But shortening the cycles helps us not spend a lot of money and time working on the wrong stuff. We shorten those cycles, but again, how will we

(13:15):
measure that we did a good job? This is frequently something that the teams I work with, my clients, usually can't answer. And we end up doing some kind of diagnostic. You know, in my consulting work, there's usually some kind of diagnostic there, which is to go figure out, "How would we know if this product works? What will we measure?" And some of those may be qualitative and

(13:38):
mushy.
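One way to keep score on the quantitative side of that diagnostic is to instrument the data product itself and compare usage against an adoption target agreed upfront. A minimal sketch, where the event log, user roster, metric names, and threshold are all hypothetical:

```python
from datetime import date

# Hypothetical usage log: one (user, day) event whenever someone opens
# the dashboard or acts on one of its alerts.
usage_events = [
    ("ana", date(2023, 2, 20)),
    ("ana", date(2023, 2, 21)),
    ("raj", date(2023, 2, 21)),
]

INTENDED_USERS = {"ana", "raj", "lee", "kim"}  # who this was built for
TARGET_ADOPTION = 0.6  # success threshold agreed before building anything

def adoption_rate(events, intended):
    """Share of intended users who touched the product at least once."""
    active = {user for user, _ in events}
    return len(active & intended) / len(intended)

rate = adoption_rate(usage_events, INTENDED_USERS)
print(f"adoption {rate:.0%} vs target {TARGET_ADOPTION:.0%}")  # 50% vs 60%
# The qualitative, "mushy" outcomes (trust, perceived effort) still need
# interviews and listening sessions; this only covers the countable side.
```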

Greg Brown (13:38):
And just to continue that thread a little bit on UX, can you tell us why it's so critical to think about that? You touched on it in some of your previous answers, but why is it so critical to think about the user experience when you're designing a data product? We see it as: if you have good design, it can lead to a good user experience that increases adoption, and then it leads to some tangible end results for

(13:59):
the business that really impact the top or bottom line. Is that kind of the right sequence, and what we're looking for, when you start with, "Let me make sure we get the user experience piece right"? Because then it can lead to all those benefits down the line.

Brian O'Neill (14:11):
Sure. It's funny, yes, I believe that, but I generally don't talk about it that way, because most of my audience thinks that user experience sounds like, "We're going to make it look nice, we're going to make it fun. It's this extra thing you do to take it from good to great, right?" Instead of, "The user experience has been designed in such a way that

(14:35):
it absolutely minimizes the amount of time you need to be in there." Effectively, the tool tries to be completely invisible as much as possible. It'll send alerts out, and the goal is you never log into the dashboard. The goal is that there's a listener that's listening for abnormal deviations in the data, and it's sending you something in email, and the email itself has everything you need to know

(14:56):
in it, such that you don't have to log in. That is also an experience, right? And even knowing that, "Oh, people don't really want to use this." It's like, "Well, why don't you want to use it? What is it?" There's something about, like, "I don't like having to talk about the numbers, because I don't know why my bonus is wrapped up in this," or whatever, versus, "The old tool sucked, and it took my life from me, I can't stand it."

(15:20):
What's behind the "I don't want to use it" stuff? Is it the perceived labor? Is it the domain information? Is it personal risk? These are all things that designers would be looking for, and then incorporating into the solution where possible. So good data product design, and I would argue good design even for the enterprise most of the time, is about

(15:42):
getting out of the way. But designers are like, "How can I get rid of stuff here?" Right? So yes, I can relate to a lot of what you're asking about here.
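The "invisible tool" Brian sketches (a listener watching for abnormal deviations and emailing everything the recipient needs, so they never log into a dashboard) might look something like this. This is purely a hypothetical illustration: the z-score rule, the metric source, and the send_email helper are all assumptions, not anything described in the episode:

```python
import statistics

def fetch_daily_metric() -> list[float]:
    # Stand-in for a real warehouse query; today's value comes last.
    return [102.0, 98.5, 101.2, 99.8, 140.3]

def send_email(subject: str, body: str) -> None:
    # Stand-in for smtplib or a notification service.
    print(subject, body, sep="\n")

def check_for_deviation(history: list[float], z_threshold: float = 3.0) -> None:
    """Stay silent on normal days; email full context on abnormal ones."""
    *past, today = history
    mean = statistics.mean(past)
    stdev = statistics.stdev(past)
    z = (today - mean) / stdev if stdev else 0.0
    if abs(z) >= z_threshold:
        # The email carries everything needed, so no dashboard login.
        send_email(
            subject=f"Daily metric anomaly: {today:.1f}",
            body=(
                f"Today's value of {today:.1f} is {z:+.1f} standard deviations "
                f"from the trailing mean of {mean:.1f}. Recent values: "
                f"{', '.join(f'{v:.1f}' for v in past)}."
            ),
        )

check_for_deviation(fetch_daily_metric())
```

The design choice mirrors Brian's point: success here is measured by how rarely people need to open the tool, not by how often they do.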

Greg Brown (15:55):
Yeah, absolutely. And I've seen that happen myself, where the iteration, the "improvement to user experience," is where we're adding things. "We're going to add this, we're going to add that, it'll be richer, there'll be more visualizations," without realizing that that just hurts the experience. That makes it all much harder to take in and actually use the tool, or it just serves to distract people, or draw their attention to more

(16:17):
things than they really need to be focused on. And that's one thing that we really focus on in design: making sure, before you add something, does it absolutely have to be there? And what are users going to be doing with it? Otherwise, it is just clutter that interferes with the user experience, really, and with someone enjoying the user experience, I would say, so that they come back and continue to use the product. So it makes

(16:39):
perfect sense.

Brian O'Neill (16:42):
You can't get to business value if you don't first go through user adoption. The business value will probably follow if you build something that people want to use, or are willing to use, or that they find indispensable. Then the business value will follow. But if you don't design for the human adoption piece, you definitely don't get the business value.

(17:02):
All you've done is spend money.

Greg Brown (17:04):
Well, and I think it's very important for folks to understand that the gateway to creating business value is user adoption. You cannot circumvent that, or shortcut that, and achieve some magical, mythical business value or create value with data, unless you have that user adoption piece. That is the only way that you're going to get there. And, just to shift

(17:25):
gears a little bit too, Brian: some of the smaller middle-market companies that we partner with aren't always able to approach designing a comprehensive data product internally. They are interested in providing some value to the business, and they're sometimes fighting some skepticism from other stakeholders. It's kind of a big question, but where do these smaller data teams start to deliver value and create buy-

(17:47):
in for data strategy, especially if they face some skepticism because the company or the firm has never used data in an innovative way before?

Brian O'Neill (17:58):
So, I mean, obviously, you need good leadership here to connect what data we have, what's technically possible, and what the business is trying to achieve, right? And so even if you haven't done things the data way, or whatever, my thing is, and this is really important at leadership levels, that the person in charge of the team, the maker, I'm going

(18:21):
to just call them the makers, whatever, data science or BI, whatever the heck it is, right? They need to be able to really speak the language of the business. This is what the business wants, right? The business doesn't want to go learn about data science and modeling and all this kind of stuff. They don't want to know that. What most of them want is a team that can relate to what their needs are. And so this is a skill that I think a

(18:42):
lot of data teams do not have, which is this consultative and research-oriented skill set to go and translate what the business is trying to do back into data product work. This is really important. So there are ways to get better at this. I kind of have this theory, particularly on the data science

(19:05):
side, that this community tends to be somewhat introverted. And one of the things I teach, when I teach doing qualitative research, or one-on-one listening sessions and qualitative interviews, is that the good thing here is it's mostly about listening. It's not about talking. It's just listening. And so if you can ask good questions here, and look at the questioning of the person not as challenging this

(19:26):
leader, but as actually helping them get to what they actually need. It's a process of peeling the onion back together, by just asking guided questions to get to the unarticulated needs. We need to do this kind of work in order to design good stuff. And you don't have to talk a lot. It's mostly about listening.

Greg Brown (19:49):
And that kind of listening gets the business team saying, "Well, this might be something that we actually adopt, we might be on the right track here. So I'm gonna see what we produce, what comes from this. But I feel like we're approaching this the right way." I think that's the most important way to start creating

(20:11):
buy-in at the outset of it, and to make sure that you understand what you're solving for, of course, too. So Brian, what's next for you? What's on the horizon for you in 2023? And for those that are looking to connect with you further, what's the best avenue?

Brian O'Neill (20:23):
Yeah, so I have the podcast. It's called Experiencing Data. And I've got a mailing list. I write over at designingforanalytics.com. Twice a year, I run a public seminar called Designing Human-Centered Data Products. This is for data science leaders, data product leaders, BI leaders. It's basically, "How do I go do all this stuff and make it immediately applicable?" So that runs in February and September.

(20:47):
I'm actually launching a community this year called the Data Product Leadership Community. For now, the DPLC. It's probably going to change names. But I'm actually trying to create a place where there are going to be different perspectives, from the user experience design world, the product management world, the data science community, and probably some engineers in there, but it's people that are

(21:08):
trying to use this product orientation in their work to actually make sure that all this stuff we can do with machine learning and analytics actually gets used and creates some value here. So those are some things I've got on my horizon.

Greg Brown (21:23):
That's awesome. Looking forward to hearing updates on all of that, Brian. And I want to thank you again for joining us here on the show. We enjoyed getting a chance to host you today. And again, I would encourage everyone to check out Brian's podcast, Experiencing Data, which we have linked in the show notes. Thanks, everyone.