Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
Welcome to the Analytics Power Hour. Analytics topics covered conversationally
and sometimes with explicit language. Hey everybody, welcome. It's the Analytics
Power Hour. This is episode 265. I think it was Socrates who said,
"the unexamined life is not worth living." And I believe he said that
right before putting on his Oura ring, slipping on his WHOOP band and
(00:29):
jumping into his Eight Sleep bed. One thing for sure though,
we've got a lot more places to collect data about ourselves than we
did back in his day. And I think it represents some interesting possibilities,
maybe some challenges. So we wanted to talk about it. I mean,
we're data people, so who better to tackle this topic? And Julie Hoyer,
(00:53):
manager of analytics at Further. Do you use any of these tools to
like, measure stuff about yourself? Funny enough, I religiously wear an
Apple Watch and it's collecting things, but I couldn't tell you the last
time I looked at the dashboard summary data in the app,
if I'm honest. Nice. No, that counts though. That counts.
(01:15):
So Tim Wilson, head of solutions at facts & feelings. How about you, measuring
your heart rate? I've got my Polar H10 heart rate monitor on right
now because I just want to see how excited I get throughout this
show. Nice. So we should run that in real time along with the
podcast to see how excited or unexcited Tim is on a topic or
(01:39):
how stressed out it makes him. And I'm Michael Helbling and yeah,
I think I've got stuff on my phone that measures how many steps
I take and things like that. Okay, but we needed a guest,
somebody who could help shed some light on this topic and bring this
discussion to you, our listeners. So we found one. Michael Tiffany is
the CEO and co-founder of Fulcra Dynamics. He was the founding CEO,
(02:01):
then president of Human, a cybersecurity company. He also serves on various
boards as well as advises startups. And today he is our guest.
Welcome to the show, Michael. It's a pleasure to be here.
Me and all of my connected devices. Nice. Are you big...
Do you do quite a bit of that? Or... I assume because
of your company, you probably do a lot of testing at least.
(02:24):
I'm rocking an Apple Watch. I'm wearing an Oura ring. I've got a connected
scale. I've got an Eight Sleep bed. I'm breathing into this Lumen device
to instrument my metabolism by looking at my out breaths. Here's how weird
I am. I'm rocking a smart, addressable breaker box. So among other things
(02:49):
I'm measuring... I'm like, monitoring power to the stove to just passively
monitor how often I'm cooking. Wow. Yep. That is a whole nother level.
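(For illustration: the kind of passive inference a connected breaker box enables is simple to sketch. The readings, threshold, and session logic below are invented for the example and have no connection to Fulcra's actual pipeline.)

```python
from datetime import datetime, timedelta

# Hypothetical per-minute power readings (watts) from the stove's breaker circuit.
readings = [
    (datetime(2025, 1, 10, 18, 0) + timedelta(minutes=i), watts)
    for i, watts in enumerate([3, 5, 1200, 1500, 1450, 900, 4, 2])
]

COOKING_WATTS = 500  # assumed threshold separating standby draw from active cooking

def cooking_sessions(samples, threshold=COOKING_WATTS):
    """Collapse consecutive above-threshold samples into (start, end) sessions."""
    sessions, start, last = [], None, None
    for ts, watts in samples:
        if watts >= threshold:
            start = start or ts
            last = ts
        elif start:
            sessions.append((start, last))
            start = None
    if start:
        sessions.append((start, last))
    return sessions

for start, end in cooking_sessions(readings):
    print(f"Cooked from {start:%H:%M} to {end:%H:%M}")
```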
What's the Eight Sleep bed? What's the Eight Sleep bed do?
Yeah. I haven't heard of that. Ah, it's magnificent. It's a bed that
circulates through... Interwoven through the entire bed topper are small
(03:13):
channels for water that run to a refrigeration/heating unit.
So the bed can either cool you down or warm you up.
Or in my case, key for marital bliss, cool me down while warming
my wife's side. Whoa. Wow. That sounds nice. But it also does a
(03:34):
lot of measuring of, like, your sleep quality and stuff like that at
the same time, right? That's exactly right. I was an early adopter.
Owning this thing feels like owning a Tesla where the same hardware has
been getting better and better with OTA updates. So while I bought it
mostly for that temperature regulation, I've seen its sleep monitoring, its measurement
(03:58):
of my heart rate in the night, like, just get better and better
and more accurate, which has been a delight. Wow. Nice. Wow.
Yeah. I used to have an If This Then That routine running against
my scale to dump my weigh-ins into a Google Sheet for
a long time, but that was a long time ago. And I think
the company that made that scale doesn't have an API anymore.
(04:19):
So that was like my gateway drug. Okay, nice. If This
Then That scripts to gather this kind of stuff. Yeah. And look at
me now. Nice. I do diligently weigh myself every morning as part of
the routine, so when I'm traveling, I miss a little bit that I
don't have a scale. But not to the point of having a
(04:40):
connected scale. I actually was given a connected scale for Christmas,
I think, a year or so ago. And I'm like, I don't think
I need that. I just take a measurement and punch it into my
phone while my toothbrush is running. But yeah, who knows? Yeah,
whatever works. Okay. All right, so... It's not a competition. What is...
(05:00):
Well, yeah, I hope not, because I'm not winning. Michael wins. Yeah. No.
All right. So, yeah, Michael, what is the word for this?
Like, so one of the things that gets used a lot is sort
of self-quantification or self-data. But what is sort of the holistic
term for this? Or what's going on in this space because obviously there's
(05:21):
even, we've mentioned a bunch of different companies and things like that,
but there's more. There's many, many more. That's right. And you can go
beyond that to DNA, like 23andMe, and those kinds of things as
well. Yes, so in the early days, I would say the pioneering hackers
who were coming together and sharing tips and tricks were talking about
the movement as quantified self. And that really was in its pioneering phase.
(05:46):
These days, like I just showed you my Oura ring. They've surpassed a
million sales in North America. This is now a popular device,
not just a niche device. And while that has taken off,
quantified self as a term of art, I would say has actually declined.
And this is a good thing, not a bad thing, because what quantified
(06:06):
self promises you by just the meaning of the words, is a bunch
of numbers. And that's not what people want. They want insights,
they want self-knowledge, and they want increasingly connected wellness.
So you see now terms of art that are more about connected
fitness, connected wellness, connected health. And I think that that captures
(06:31):
something important, which is the intention, the goals here. It's
not really about counting steps. It's actually about 10 more years
of a good life. Yeah. So what is... I guess, is there a
term or is there a singular
idea or vision where everyone says, this is what we're trying to get
(06:52):
to is X, right? Yes. So, if I had to pick one,
it would be connected wellness. And the reason why it's those terms in
particular is that we're in a transition right now based on the recognition
that healthcare has for many, many years really been something more akin
to sick care. It's about fixing you after something is broken.
(07:18):
And that's not awesome. There are things you should be doing right now
to improve your wellness that mean that fewer things will go wrong.
So that's the... Apart from just branding and marketing. That's the true
(07:39):
reason why you're seeing the word wellness more. It's to try to differentiate
the proactive pursuit of optimal health versus recovery from something going
wrong. And then we're doing that in two ways that are new,
signaled by the word 'connected'. One is that we're wearing increasingly
(08:03):
smart devices that in effect make you like a type A personality,
like, make you like a really good diarist without you having to do
any work. I just step on my scale, I don't write anything down,
which is nice. And so it's connected in that sense. The device is
somehow probably really sending bytes over the wire, and then also connected
in the sense that this data by being digitally native is more shareable
(08:28):
with a doctor, with a loved one, maybe even just shared socially because
so much about staying fit and healthy depends on social engagement
and doing it with others. So if I had to pick two
words to capture everything that seems to be
the ascendant term, it would be connected wellness. And who's...
(08:53):
This is funny. I think of the early days of the Internet of Things,
where there was talk of... imagine your garage door being
able to tell you that it's got a bearing that needs to be
greased and it's gonna go out. Which sometimes those seem kind of forced.
I don't spend a whole lot of time feeling like I
(09:15):
need to preemptively maintain my garage door opener. It will break every
10 to 15 years. But when you talk about health,
logically, early detection and preventative care make sense.
(09:35):
Is the thinking that that is in the hands of a person,
or is the thinking that it's in the hands of the...
I mean, the data collection has to... it's geographically tied to the human.
But is it something where the healthcare provider will say,
I need your historical data, if you have it? Or, like, who's...
(10:00):
Yeah, where does it come from? Here's how I'm approaching this in my
own life. And I found this to be transformative. It actually goes back to
Michael's opening observation about Socrates. Self-knowledge is incredibly
hard. It's actually incredibly difficult to achieve extraordinary self-knowledge.
(10:24):
And so the way it's done, the best way to achieve extraordinary self-knowledge
and insight for the past several thousand years, going back to
Socrates, going back to Vedic religions in India, or even the Rule of
St. Benedict, which I was just looking at, written some 1500 years ago.
(10:45):
Everyone does the same thing worldwide, which is dramatic simplification.
You live like a monk. This is the point
of the monastic life. It's to dramatically simplify your life so then you
can focus and achieve extraordinary self knowledge and insight. And sometimes
(11:06):
you peer into the very nature of reality as well.
I don't want to do that. I want to have the self awareness
of a monk while actually engaging with the world like a bon vivant.
And so the challenge I set before me, being a computer nerd is,
can I use computers to help me out in this regard?
(11:30):
Because computers are infinitely patient. And honestly, they're really good
at counting stuff. I believe that kung fu masters centuries ago really could
cultivate the ability to just be constantly aware of their own heart rate.
And that was probably awesome. I'm not going to do that.
(11:50):
I'm going to put on an Apple watch.
So that's sort of an empowering view of the world. But I would
say that something must be missing, because the people donning Apple Watches
or Oura rings or other kinds of instrumentation are augmenting
their bodies, they're augmenting their lives with breakthrough technology
(12:12):
that was sci-fi just decades ago. But I don't think we feel
like the Six Million Dollar Man, where you strap this on and you
just feel magically empowered. So what is it, like, what's missing?
And I think that siloization is a really big limiting factor.
(12:34):
And I'll give you a healthcare example and we'll go back to like
my connected breaker box. My bed has all these awesome instruments. It's
measuring my HRV. It'll tell me how long I spent in deep sleep,
but it knows nothing about what I did
the day previously that contributed to or ruined a good night's rest.
(12:59):
I, for instance, learned, and other Fulcra users have seen the same thing,
by getting passive telemetry on my eating. Now, I'm not even a
big food logger. That's like a little bit too much work for me.
But I will put on a CGM. So I've done multiple experiments wearing
a continuous glucose monitor that just passively
(13:23):
records my blood glucose, and therefore is going to see blood sugar spikes
when I've eaten a bunch of carbs. And what do you know,
like a few weeks worth of experimentation showed that if I want better,
specifically deep sleep, I should shift my carbs if I'm gonna eat any
to the beginning of the day. So carbs before noon, I sleep well.
(13:46):
Carbs after noon? Eh, you're starting to get into a danger zone.
Like dessert after dinner, forget about it. I'm gonna have an elevated heart
rate and I'm gonna have shortened deep sleep. It is impossible to know
about that causal relationship unless you somehow tie in the data that's
drawn from the CGM with the data that the bed knows about.
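(A rough sketch of the kind of notebook analysis being described here. It assumes two hypothetical CSV exports, one of CGM readings and one of nightly sleep summaries; the column names and the 140 mg/dL spike threshold are invented for the example.)

```python
import pandas as pd

# Hypothetical exports: cgm.csv has timestamp + glucose (mg/dL);
# sleep.csv has night (date) + deep_sleep_minutes from the bed.
cgm = pd.read_csv("cgm.csv", parse_dates=["timestamp"])
sleep = pd.read_csv("sleep.csv", parse_dates=["night"])

# Flag days with an afternoon-or-later glucose spike as a crude proxy for late carbs.
cgm["date"] = cgm["timestamp"].dt.normalize()
late = cgm[cgm["timestamp"].dt.hour >= 12]
late_spike = late.groupby("date")["glucose"].max().gt(140).rename("late_spike")

# Tie each day's late-spike flag to that night's deep sleep and compare.
merged = sleep.set_index("night").join(late_spike).dropna()
print(merged.groupby("late_spike")["deep_sleep_minutes"].mean())
```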
(14:07):
So we've surrounded ourselves with these ostensibly smart devices, but they're
not really smart, they're just data producing devices. The smartness comes
from a higher level of analysis. And I feel like people like me
are on the leading edge. We're geeking out on our own data, we're
(14:29):
doing data science on this raw data in a Python notebook, which is
maybe too much to ask of the average person,
but that's going to be within the grasp of the average person to some
extent already. And to an increasingly large extent because of coding copilots.
(14:50):
So people who've never written a lick of code before are sometimes getting
like one-shot outputs of functional code from ChatGPT. That means that what
used to be really esoteric data science skills are becoming increasingly
within the grasp of ordinary people. But only
(15:10):
if you've gathered and de-siloed the data. Hence my focus with Fulcra.
I think that's something I've been thinking about a lot: even once
you have all the data in one spot, so that you could use it
to paint a bigger picture and ask more helpful questions about
your health, how do we determine what good looks like?
(15:35):
Because what's interesting is some of the devices seem to be making some
of that decision and determining what is a good range of these metrics,
other ones don't. They truly do just collect the data.
So it's interesting to think when you start to connect those things and
tie them together, kind of back to Tim's question. Does it become a
(15:55):
place where that is baked in so an individual can go ask these
questions and get those types of answers? Or is it more so that
the value is it's all together and you could take it to a
professional to help tell you what does this mean? Is this good,
is this bad? And based on that, like what do I do about
it? Yeah, yeah. I'm thinking about changing the world in
(16:17):
this order. Once you have the self-knowledge that I'm describing,
then you also have new ways of sharing how it's going in your
life with another person, which could be a doctor, but could just be
a spouse, could be a group of friends. So everything starts with solving
the observability problem. I think it's too hard to get help because
(16:41):
it takes so much effort to just describe to anyone else like this
is what's going on with me, this is how I've slept the last
week, or this is what's stressing me out.
All of that data you can think of as the human equivalent of
what we call in DevOps observability. The instrumentation, these connected
(17:04):
devices, they're solving the observability problem. Then there's like this
analysis problem which we just sketched. And then finally there's new forms
of sharing. And I'm like really excited about that.
I want to know how my friends are sleeping in general.
(17:26):
How is it going with people that I love but now live distant
from me? And also, what's normal? So what I'm hoping is, by reducing
the friction and the risk of sharing personal observability data like this,
(17:48):
by making it secure and controllable, then we'll also be able to
pool this data to find out what's normal across larger groups.
So you can kind of compare yourself to averages. Right now it's
like really hard to tell, am I a weirdo? And I
(18:10):
think the internet is sort of good at solving those problems
if you can build the bridge between the data collection
and the kind of social sharing that you want to do.
Hmm. I've got anxiety now. As it is with... I mean with Strava
or, I mean I had Fitbit before or Apple. I mean there does
(18:34):
feel like a broad parallel that is not encouraging, which is, moving away
from us measuring ourselves to just kind of the world of digital, where
at a corporate level there is this obsession with, let's gather everything
we can. I mean, the 360-degree view of the customer taken to
(18:57):
an extreme would be a marketer knows how often you're cooking so they
can make easy-cook meals available to you or something.
Yeah, right. I mean, there's the nefarious which I feel like insurance and
government we should get into as well. Right. But just the idea,
I mean there have to be people listening because I'm experiencing it a
(19:20):
little bit myself. Like oh my God, like sharing, comparing, like
don't we have a challenge with our youth just from the crude form
of TikTok and Instagram comparing themselves and it's not good for their
mental health. That's right. So it's like this gather all this data first,
(19:43):
hope the analysis happens and then we're creating community. Is there a
dark side or downside to that that we need to figure out?
I think so. I think there's extraordinary benefit and an extraordinary risk
and that's why I, an entrepreneur most known for starting cybersecurity
(20:06):
companies, have... That's why I've waded into this. Our design of Fulcra
importantly starts with who we're working for and how we make our money.
When you create an account with Fulcra, your data belongs to you.
You are not sharing it with us, you are sharing it with your
(20:26):
future self. And our revenue model is asking for money for that service
and we need to re-earn our customers' trust every day.
And if we lose that trust then they will stop paying us money
and we will be very sad. So I think that being a force
for personal data sovereignty in this way is something you have to choose
(20:48):
to do at the foundation of your company and build into your DNA.
I think that if you are an ad funded company, even if you
are a multibillion dollar or multitrillion dollar ad driven company, you
cannot just decide to like pivot into a new line of business where
customer data belongs to the customers and is encrypted in motion and at
(21:11):
rest and is just designed for whatever the customer wants to do with
it and nobody else. The control that people have over their own data
I think is actually going to be of increasing importance as AI agents
become an increasingly important part of the future. Because as we can see
over especially the last two years of rapid improvement in generative AI,
(21:38):
it's going to be very hard to control AI models by trying to
put a cap on their capabilities. I don't even see how that's going
to work. I don't think we can say AI can only have an
IQ of 140, no higher. That's just not going to work.
So how are ordinary people going to have any control
over an agent that they're asking to help? So you want to get
(22:02):
help from a helpful AI assistant. How are you going to be able
to accept that help, share enough data with that agent that you can
get some help, but make it like a two-way door,
make it a revocable commitment. And I think there's only one way to
do that and that's to control access to your own data.
(22:22):
So you can grant it to an assistant. You say sure,
you can read my health data but you can't copy it.
And if I change my mind for any reason or no reason at
all, I get to turn that off. If instead all of our data
is going to live with some large tech provider that's also running the
models, if the only way you get the help is by, like, uploading
(22:43):
all of your data in a one-step process, you've completely lost control.
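(To make that grant-and-revoke idea concrete, here is a toy sketch. The class, names, and logic are invented for illustration and are not Fulcra's actual access model.)

```python
import secrets

class DataVault:
    """Toy vault: the owner issues read-only grants and can revoke them at any time."""

    def __init__(self, data):
        self._data = data
        self._grants = {}  # token -> agent name

    def grant_read(self, agent):
        token = secrets.token_hex(8)
        self._grants[token] = agent
        return token

    def revoke(self, token):
        # Change your mind for any reason, or no reason at all.
        self._grants.pop(token, None)

    def read(self, token, key):
        if token not in self._grants:
            raise PermissionError("grant revoked or never issued")
        return self._data[key]  # read access only; the store itself is never handed over

vault = DataVault({"resting_hr": 52})
token = vault.grant_read("sleep-coach-agent")
print(vault.read(token, "resting_hr"))  # 52
vault.revoke(token)
# vault.read(token, "resting_hr") would now raise PermissionError
```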
And that one-step handover is not the future that I want to bring about. So
what we're trying to do here is empower people, as I said,
with self knowledge, but it's even more broadly building an important force
for personal data sovereignty so that we can have the benefits of AI
(23:04):
but put people in control. It's interesting, too, with it being health data.
I think it brings a very different awareness to
the world of AI and the sharing of data, and your own data,
than I think people... It's very different than today. Like, some people
think, oh, you care about what I clicked on, what ads I
(23:25):
saw. It's your data. But it feels really different when you start to
talk about your personal health metrics. And so it's... I'm really
happy to hear you talk about it that way, and it's really helpful
for even just my understanding of
like, what could this look like, what should this look like ethically in
the future? But I really hope that it kind of sparks that light
(23:47):
bulb for other people of like, when we're talking about your data and
privacy and the importance of it and how it interacts with AI. Yeah, thinking
about all your other data the way you think about your personal health
data, I don't know, it really sparked some clarity for me. But
it also highlights the gap we have in the United States around data
ownership and data rights as a person, because there aren't laws in the
(24:10):
US about, if you give that data to somebody else, what they can
use it for. And so health data can be predictive of many different
things potentially. So just like the car insurance companies want you to
take the little thing and plug it in to track all your movements
to save you money, but in reality, it's helping them create better predictive
(24:31):
models for what the likelihood you're going to get in an accident is
and what the risk you have to them as an insured person is.
And so, in the same way, it matters where that data goes. And so, like,
even if you take your data... and I think this came up with
23andMe, because I think they were contemplating selling the company
to somewhere else, and it's like, well, what happens to all that
(24:53):
data if someone else comes and buys that company? What are they allowed
to do with that data if they acquire it? What happens when Meta
buys Fulcra? Well, I mean, and so, like, that's a legitimate concern, because
there's no underlying regulatory structure that says what someone who comes
along and buys a company like that can or cannot
(25:13):
do with that data that they "own" now. Yeah. Right. I love
this kind of thinking and I think that when you dig into privacy
by design at many companies, you find that there is this end state
where people just say, well, we'd never do that. Yeah. And that is an inadequate
(25:36):
answer because you cannot guarantee that you will always have your hands
on the wheel. So in fact, I would encourage anyone listening as they're
thinking through what privacy by design at an Olympic level really looks
like. You have to show how you are preserving privacy even if Ultra
(25:58):
Super Mega Evil Corp acquires your company. You actually need to limit
the powers you have as a business operator to mess with people's data
and inspect their data so that even under the conditions where you're acquired
(26:18):
by a company that doesn't share your values, they can't just like switch
on the data vacuum mode and undo all of your work. And absolutely, this
is not just me thinking about this. Happily, there are good patterns of
privacy by design that are built to operate at that high level.
And I think that's absolutely the level that literally every company should
(26:39):
aspire to. But there's having, there's following all the principles
of privacy by design and putting something in place. And then there is
also the... I mean, you sort of said it earlier, there needs to
be a trust that somebody's going to provide their
data. And explaining that still winds up being tough with the masses, with
(27:04):
those million people with an Oura ring. If you say... I mean,
I would guess that most of them are saying, I don't really care.
I'm not giving it a whole lot of thought. Take my data.
But if you're going with 300 million people and the truly paranoid fringe,
the... We are a very weird little subset of four people
(27:25):
here who are happy to spend an hour talking and thinking about this
and we're not remotely scratching the surface of what's actually going on
in design to make that happen. So actually convincing Joe Smith that,
no, this really is okay and maybe this becomes just a societal breakdown
(27:47):
thing. They're like, says who? My cousin Vinnie said you're going to use
this for nefarious purposes and no amount of rationalization will change
their mind. So to me, this is a dimension of business design.
I'm a business nerd. And an observation that I've had is that whatever
a company says its mission is, if the execution on the mission is
(28:12):
not exactly what earns them money, that's not the mission.
The revenue is the mission. Over time, if these two things are not
in alignment, I'll tell you which one wins. It's the one that increases
earnings. So you can just know that and then you can consider that
a constraint of business design and then construct
(28:34):
a revenue model that is truly consistent and in fact, even supports your
mission. That's one of the things that I'm most proud of with the magnificent
success of Human: a cybersecurity company that fights cybercrime at scale, goes
after the profit centers of cybercrime, and, importantly, doesn't have to sell
to the CISO. It's not just another layer of protection. If you're in
(28:58):
the business of fraud detection, you actually reduce losses due to fraud.
And so the reason why you get paid is that
you charge less than the savings. So then every single customer knows exactly
why they're paying you. And the incentives of that company are such that
(29:21):
Human makes the most money by going after the biggest source of cybercriminal
profit, which therefore means that it is designed to have the biggest possible
positive effect on the world, which is super cool. Here with Fulcra, here's
the way I see this playing out. Lots of people
(29:41):
consider the universe of everywhere there's data about you. Everything you
use that generates some data. So Facebook knows some stuff about you and
Apple knows some stuff about you. And maybe the Oura ring has a
little bit of data. And I don't think you need
to go around like deleting all of that. But
(30:02):
if you and only you have the superset, if you have all your
data from every single one of those sources, then you're the only one
who has the complete picture. And you could decide to then invoke some
right to be forgotten, or you ask for all your data to be
deleted and then you'll truly only have... You'll have the only copy.
(30:23):
But I think it's good enough that you are the master of the
complete set, because that'll alter incentives going forward where some
people who just already have some sliver of data about you,
they don't have to ask permission, they've already got it. But if they
want to have access to the full picture to provide a better service
(30:46):
or whatever, they have to ask you. And
to a great extent I think that's winning. If individuals are just in
charge and could say yes or no if they're asked at all,
that would be pretty great. Right now, in the real-time bidding for
like most of the ads that are getting served to you,
even though you've had to answer a bunch of nonsense cookie consent pop
(31:10):
ups, like no one's really asking your permission for doing some kind of
cookie or pixel sync that is connected to some email newsletter that you
signed up for that they're using to figure out how many people are
in your household and what your income is. You were just not involved
in any of that. And that's the little turn that I want to
(31:31):
just make on society. And we could do that through lawmaking, trying
to force people to ask for your consent. But I think what's even
better is to reward them, to rationally motivate them to deal with you.
Because if they deal with you, they get better data and will deliver
you a better experience. So they'll do it if it's in their best
(31:52):
interest. And I think that happens when people are in control of like
the super corpus. You bring up a point that I actually would love
to kind of circle back on because it goes in two areas we've
talked about. One, I do feel like if there was clarity,
so say you were the owner of all your data. I feel like
the only way to get people to share their data openly,
(32:14):
like on a large scale with companies is if those companies could tell
us as the individuals, like we would love this type of data from
you because then we could answer these types of questions. Here's the benefit,
like that value trade off they talk about, like if you're allowing cookies
what does it get you in return? Why should you share this with
this company? But what's really interesting is we know that one doesn't
(32:36):
happen. I think it would be amazing if it could. But because we
know that people aren't starting with a question in mind always.
There is still the obsession that we talk about a lot on the
show that companies have of like just collect all the data.
And I do feel like it goes into the connected health conversation we're
having of people think if I have all the data on myself then
I'll be able to answer all these amazing questions. I don't know what
(32:59):
questions I'm exactly going to ask, but if I have all the data
I'll be able to. And then you get into the reality of
a lot of these questions you can't answer or you're answering them with
data that you inherently realize has biases or errors in it.
So then it kind of takes you down the path too of like
there's a whole area of the industry that's spun up then to collect
more and better data but we're still probably going to miss the piece
(33:21):
of like what's the motivation of collecting all this data? What do companies
want to ask and use it for? What do you yourself want to
ask? What's a helpful question to ask? What should you be collecting data
to then get out of it? So I know there's kind of like
a lot of branches we could take off that but it's just been
interesting hearing the last couple points you've made.
I'll throw this out there as like a concrete prediction of the future.
(33:45):
I think the way this plays out is that there's like too much
data for a human to sort through. There are too many potential use
cases for it all. But it really does seem to me like we're
headed to a place where helpful AI assistants are within everyone's grasp.
So what I think will happen is you will have a kind of
(34:06):
concierge agent that only works for you, that has trusted access
to your data, and it intermediates with other companies' agents and essentially
like negotiates on your behalf. So instead of you having to deal with
a whole bunch of questions about consent and individual offers, there's
(34:29):
just going to be too much to sort through, but you'll be able to
delegate it to your agent and just be like, show me the two
marketing offers that you think are really going to land with
me. That's like all... That's how much attention... That's how much human
attention I actually have. And so the... Your agent might be dealing with
countless kinds of unsolicited offers or ideas and is providing the
(34:52):
curation layer based on knowing you, and then, in a rules-based way,
can share like the little subsets of data that are going to be
able to activate those offers or make them work. I see.
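(A minimal sketch of what such a rules-based sharing layer could look like. The requesters, field names, and rules are all invented for illustration.)

```python
# Hypothetical sharing rules the owner configures once; the concierge agent
# enforces them when deciding which data subsets may flow out to activate an offer.
SHARING_RULES = {
    "fitness-retailer": {"step_count", "resting_hr"},
    "sleep-coach": {"deep_sleep_minutes", "resting_hr"},
}

profile = {
    "step_count": 8421,
    "resting_hr": 52,
    "deep_sleep_minutes": 96,
    "glucose_series": "...",  # never listed in any rule, so it never leaves
}

def shareable_subset(requester, data, rules=SHARING_RULES):
    """Return only the fields the owner's rules allow this requester to see."""
    allowed = rules.get(requester, set())
    return {k: v for k, v in data.items() if k in allowed}

print(shareable_subset("sleep-coach", profile))
# {'resting_hr': 52, 'deep_sleep_minutes': 96}
```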
If I'm right, that means that agent-to-agent communication is going to
be the majority of internet traffic within like 10 years. That's kind of
scary to think about, though: they could be talking on your behalf in the
(35:14):
background and then that becomes its own whole black box. It's cool,
but it kind of scares me too. Just as long as they open
the pod bay doors. Well, but I mean, back to that,
I think there is this, and I know I've run through it when
I've had a Fitbit, which, I don't know, I've probably gone through six
(35:37):
Fitbits over I don't know how many years. And when I switched to
an Apple Watch, there was... I genuinely felt an, oh my God,
I'm like losing all this historical data. And I draw that parallel to
the business world. The reality is, I bet when somebody has a
(35:57):
heart issue, they get sent home with a heart monitor and they say,
let me collect a couple of... Let me collect a couple of weeks.
Wear this for a month. Like, so as you're talking AI agents, my
brain went off on a, I want to lose weight or I want
to sleep better, or I want to do X or Y.
(36:19):
Here's all the data that I'm already collecting in an aggregated way.
Here's what's already there. What can you do with that? Have the agent
tell me, you know what, you should put a CGM on
for a while, but not turn it into this...
There's nothing in my entire history of working with analytics that makes
(36:40):
me think that anyone is going to be good at saying collect this
data for a while for a specific purpose. Because there's... Well,
just in case. Imagine the next time you ask, if you've already been
collecting that, then you don't need to collect it for another
two weeks. So the exchange you two just had, had me
(37:01):
thinking like, is there a data... Because one of the privacy by
design principles is around, like, collect the minimal amount of data. Where
does that fit into it? That is, don't collect it just in case you
need it. Collect it once you know what you need. But this nebulous
(37:21):
get everything and then we'll have the most to work with.
Some of it's not going to ever matter. Or not matter enough to
make it worth it. Yeah. And in the name of prevention,
it's kind of hard to make that case. Yeah, that's right. Yes. So
in the longevity context, I think if you ever want to train an
AI on yourself, you kind of want to have as much data as
(37:42):
you can possibly afford to have. So things get different when you
think about data retention, when you're thinking about it for your own purpose
versus regulating businesses for their commercial purposes. One of the reasons
why we felt honestly compelled to create Fulcra was because of the data
lossage that you just talked about. The fact is that
(38:05):
GeoCities died. It turns out the internet isn't forever. Data will just
completely go away. And you've got a host of options for saving files: Dropbox,
Google Drive, Apple's iCloud. But there's no streaming data store
(38:26):
for consumers. There's no like Kafka for people. So for data like your
location history, your calendars, any biometric. My heart rate just keeps
happening. Thank goodness. So it's not a file.
It'll never be a file. It is a stream. So I need a
streaming data store for it. And there literally were no options,
(38:49):
so we had to write one ourselves. And the way I see
this being brought to bear over time is that all of these data
streams that I have pouring into my Fulcra data store are capturing how
I live and what's going on with me.
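(To illustrate the concept, a toy version of a consumer stream store, an append-only log of timestamped samples, could be as small as this. It is illustrative only, not how Fulcra is built.)

```python
import json
import time
from pathlib import Path

STREAMS = Path("streams")  # one append-only JSONL file per stream

def append(stream, value, ts=None):
    """Append one timestamped sample; a heart rate is never a file, it just keeps happening."""
    STREAMS.mkdir(exist_ok=True)
    record = {"ts": ts or time.time(), "value": value}
    with open(STREAMS / f"{stream}.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")

def read(stream, since=0.0):
    """Replay samples newer than `since`, the way a consumer would tail a topic."""
    path = STREAMS / f"{stream}.jsonl"
    if not path.exists():
        return []
    with open(path) as f:
        return [r for r in map(json.loads, f) if r["ts"] > since]

append("heart_rate", 62)
append("heart_rate", 64)
print(read("heart_rate"))
```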
Situational awareness is one of the things you need to give to a
(39:14):
potential assistant so they can actually be helpful. Right now we're all
experimenting with chatbots where you have to initiate every conversation
and that's really limiting. I want to live in a world that's more
like what you just described, Tim, where some external source of intelligence
points out what I'm missing. Tells me about a thing I wouldn't have
(39:34):
thought of and is like, dude, you need to put on a CGM
for a couple of weeks. It's not going to be forever.
We just need to sort of sample this diet of yours and see
what is up. I think a lot of people want that.
Kind of like, "I'm looking out for you" proactive guidance. Now I'm going
to go weird, which is I want to leave this data corpus behind
(39:57):
for my heirs. And to make all of this data unambiguously mine and
unambiguously inheritable, I need to collect it before I die. My kids are
not going to be writing to Amazon or whatever and being like,
Please let us export the data. It's over by then. You need to
(40:17):
have... It needs to unambiguously be yours before the event. And
what does all this data add up to? It adds up to how
I lived. It adds up to who I did the living with. You're
going to be able to, in some cases, probably recreate my tone by
transcribing this podcast and feeding it into ElevenLabs and capturing my
(40:41):
voice and you'll capture some of my vocal intonations. But none of this
tells you about all that tacit stuff. All the procedural knowledge. So an
AI model that's trained on me, that lives on after me,
is a model that I hope will bake cookies with my great,
(41:01):
great grandchildren. I'm extremely proud of my almond flour chocolate chip
cookie recipe. And it's not just about the ingredient list, it's about how
I do it. So you should be able to like, walk into the
kitchen in the future and boot up Grandpappy Michael
and we're gonna bake cookies together. This is gonna be great.
But only in the first part of the day, not later.
(41:24):
That's right. That's right. Yes. I was thinking there would be,
there would also be an agent saying, you have not asked...
Hey, I'm Grandpappy Michael and you haven't asked me to make cookies with
you in a while. Oh, my God. Oh, that's a little too on
the nose. Don't you want to connect with your ancestry?
(41:46):
Guilt tripping beyond the grave. Yeah. You never call.
Yeah, I mean, there's something that says, like, you could always be
making better choices day to day. There is
a bit of a bleak, hey, do you really want that next...
(42:09):
I know you made the cookies. That's good. But you really...
Do you need the third one? We've been monitoring you and
I don't know. I mean, it's... It is interesting because obviously this vision
of the world creates... it kind of brings to life some very
interesting possibilities. Kind of like you've been talking about Michael,
(42:31):
and then some concerns as well. And so it'll be very interesting to
sort of see how this progresses. And the one thing, unfortunately,
we can't progress much further. We do have to start to wrap up
because we're running out of time. But it's... This is pretty fascinating.
And at the same time, sort of like, I think on the downside
(42:52):
risk part of it, we all sort of envision that guy,
Bryan Johnson is his name, that sort of, like, measures every possible thing
and wants to live forever and we're sort of like, yeah,
I don't think that's me, but I think there's a... Somewhere there's a
happy medium. Did you catch recently he found out he was doing...
There was one of the things he was doing that was actually
(43:12):
working in the opposite direction. I can't remember what it was,
but... Well, that's comforting, actually, a little bit. So that's fine.
But it also is kind of exciting to sort of think of yourself
like Neo in the Matrix and you turn around and be like,
I know kung fu. Because I didn't have to study to become a
kung fu master, but now I have these AI assistants and data that
(43:34):
helps me do the things they could do, like understand my heart rate
and those kinds of things. Making a data-driven decision that one of
your health interventions wasn't working is kind of where we all need to
be. Instead of absorbing the recommendations that supposedly worked for
the 22 people in the double-blind clinical trial, but might not work
for you, the question is, what works for you? Specifically you.
(43:56):
And then you want to double down on those and stop the ones
that don't. So I'm optimistic about that kind of tuning over time. I
think lots of people are going to live for a very,
very long time from here. Yeah. Until we upload ourselves into the machine
God. Oh, wait, did I say that out loud? Yes, that's right.
Yes. Bring on the silicon brains. I mean, we didn't even touch on
(44:18):
Neuralink, so that's a second episode maybe. Okay, we do have to wrap
up, but one thing we like to do is go around the horn,
share something that we think might be of interest to our listeners.
It's been a really awesome conversation though, Michael, and thank you so
much for joining us to do it. But yeah, you're our guest.
Do you have a last call you'd like to share? It is outrageously
(44:39):
cold here in coastal New Hampshire. It's going to get down to 3
degrees Fahrenheit tonight. So the first thing that pops in my mind is
actually just my favorite new product. I got INU heat gloves, I-N-U
heat gloves. So get this. They're gloves that take a battery pack.
(45:00):
The battery pack doesn't use some weirdo proprietary connector. It's USB-C,
thank goodness. So I charge the battery packs from a USB-C
outlet, I snap them onto my gloves, and oh, my God.
They really do work. Just... That's awesome. It's so cool. That's so cool.
(45:21):
I just got a fleece for Christmas that does the same thing and
it's... You literally just hit a button and it turns on and it
warms you up all over. Michael Helbling, this is... Julie got the...
My son got my wife the same vest because we were gonna go
skiing and that's so... And Helbling had shown me his and I was
(45:41):
like, that's weird. And then realized that actually my wife had also gotten
one for Christmas and it's similarly, like, hooked. And she's had electric
gloves for a while. I know, right? Here I showed up and I'm,
like, talking, I'm like, oh, yeah, I have an addressable breaker box. Like
I'm doing all this like crazy mad science
but I swear to God, like the self heating clothing makes me feel
like I'm living in the future. Like, yes... That's awesome. That's so awesome.
(46:07):
That's awesome. All right, Julie, what about you? What's your last call?
My last call. I'm sure everyone's heard about the congestion tax in New
York. I know it's a big thing. And I had found the link
to the congestion pricing tracker and it's got some good data visualization.
I'm really interested to see as time goes on, like what do they
(46:27):
find? They even had done, I think, a good job of stating what
they're hoping will happen from it. So I love that they actually paired
it with, hey, this is how we're visualizing things. Want to know how
they're going to analyze it? What are their conclusions going to be?
But my favorite part, Tim, is that when I got to the bottom
of this tracker, it actually says that it is run by,
(46:48):
I'm guessing two students at Brown University and supervised by Emily Oster.
So I was like, no wonder. I love this. Oh, my God.
It's great. So I've just been peeking at it. It hasn't been running
obviously too long just for this year so far. But I think it's
really cool and I'm excited to see what comes out of it,
especially knowing that Emily is involved. So I think that might be a
(47:09):
last call that needs to become a future episode right there.
I think so. Awesome. That's so cool. I saw that same thing.
I was like, oh, my gosh. So that's so cool. All right,
Tim, what about you? What's your last call? So as I tend to
do, I'm going to do three, I think. So one... Three. I
want to call out three. Yeah, they'll be quick. And Cassie Kozyrkov
(47:31):
will be included in one of them. So
one, back when we first started talking to the Fulcra team about having
Michael on for this, I actually tried it out, kind of hooked up
what I could. And it was also kind of interesting, even I found
that I'm not that connected. But there's no Swarm connection. It's crazy
(47:52):
how many things we have that are tracking and the challenge of tracking
everything. But I think there is a... There's like a seven-day trial
if anybody wants. You just download the app and you kind of hook
up whatever services and you kind of get to see what the aggregated
data looks like. Is that right? Yeah, yeah, yeah. Everyone should give it
a trial. I think most people are surprised by the data that they
have and just didn't know about. You quite likely have years' worth of
(48:16):
step count data that you didn't even know about, because it was sort of
silently turned on by your iPhone. Yeah. So it's always... Even if you
just want to look and you like delete the app after seven days,
like, it's a fascinating look. And it has a very well-documented API
from playing around with it. So we're not... This is not a paid
(48:36):
endorsement, but this whole discussion, if it's got people thinking, oh,
that's kind of worth checking out. And I think actually hearing you talk
about sort of the vision kind of makes it a little more exciting,
gets people thinking. Thank you. So that's one. Number two, just a PSA
for anybody who, if you're not already following Cassie Kozyrkov, like,
what the fuck is wrong with you? But you should have caught that
(48:58):
she's moved over to Substack and has gone through her three weeks of
acting training and whatnot. So, just in case: decision.substack.com.
And then my core last call is completely off.
Not really analytics, but over the holidays there were reasons that I needed
to explore new podcasts and I did not realize that Mike Birbiglia had
(49:22):
a podcast called Working It Out, and he has... David Sedaris comes
on for kind of his second appearance, and oh, my God. I
don't know of like two more delightful people than Mike Birbiglia and David
Sedaris on the Working It Out podcast. So if you're looking for
something... if you know either of those guys and their sensibilities
(49:44):
and you're into them, that was like, oh, my God. Just, like, heaven
of listening for 40 minutes or however long it was. Has nothing to
do with analytics, but had to put in a plug for that as
well. What about you, Michael? Do you have six last calls?
This is just... What's a four? Is that a thing? No. A quad.
(50:07):
A quad. And I've got an Octa. No, it's just really pushing the
limits. I have an AI agent that will talk to your AI agent
about my last calls. So we'll negotiate. No, so, no, actually.
So one of the things I ran into recently that I thought could
be a little bit helpful to our audience is a lot of folks
(50:29):
use Google Analytics for website tracking. And one of the things that is
kind of required to make that tool useful at all is to export
it into BigQuery. But what's in BigQuery is often quite different than what's
in GA4, and that's kind of hard for business users. So Himanshu Sharma
actually did a mapping of those metrics and the calculations to get to
(50:54):
those metrics pretty comprehensively from GA4 to BigQuery. So if you're
in that place where you're trying to navigate that, that could actually
be quite a good resource to use. Then, alternatively, you could also switch
to a better tool. But in the meantime, that is one I
would say you could bookmark and use as a good reference.
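(For a flavor of why such a mapping is needed: even a basic GA4 metric like sessions has to be derived in the raw export. The sketch below shows one commonly used derivation, assuming a standard GA4 BigQuery export; the project, dataset, and date range are placeholders.)

```python
from google.cloud import bigquery

client = bigquery.Client()

# Sessions in the GA4 export are derived, not stored: one session is one
# distinct (user_pseudo_id, ga_session_id) pair across the daily event tables.
query = """
SELECT COUNT(DISTINCT CONCAT(
  user_pseudo_id, '-',
  CAST((SELECT value.int_value FROM UNNEST(event_params)
        WHERE key = 'ga_session_id') AS STRING)
)) AS sessions
FROM `my-project.analytics_123456.events_*`  -- placeholder dataset
WHERE _TABLE_SUFFIX BETWEEN '20250101' AND '20250131'
"""

for row in client.query(query).result():
    print(f"Sessions: {row.sessions}")
```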
So. All right. So I think this has been so fun to kind
(51:19):
of dive into this conversation, Michael. So thank you so much for joining
us. It's been really cool. I love kind of hearing your vision for
the future, the way you're thinking about this, the way that the world
is progressing on these fronts is kind of a very cool frontier that
we're on again both with AI and the connected health and the connected
self. So it's... I really appreciate you kind of sharing some of your
(51:44):
thoughts and things about that. And I'm sure as our listeners have been
listening, they might have some thoughts and feelings about this as well.
We'd love to hear from you. So go ahead, reach out to us.
The best way to do that is probably on LinkedIn, or you can connect
on the Measure Slack chat group or also by email contact@analyticshour.io.
(52:07):
Michael, do you... Are you active on social media at all?
Could people find you out there? Yeah, principally find me on LinkedIn,
the company Fulcra Dynamics, also on LinkedIn. And I'm on X with my
old-school hacker handle of Kubla, K-U-B-L-A.
Hit me up there and find Fulcra on X as well.
(52:28):
Awesome. Thank you. So you can reach out to him
and follow him on those channels as well, so you can hear what
the latest and greatest is in this crazy changing world.
All right, well, hey, listen, one of the things we're trying to do
this year is make sure that people get access to this show.
One of the ways that you can help with that is
(52:49):
putting a rating or review where you listen to the show,
whether it's Spotify or Apple or wherever. We would love to have you
rate the show. Review the show. That really helps, apparently algorithmically,
until AI can take over and recommend us to the right people. iHeartRadio,
is that right? That's really where we're... Yeah, we're really targeting that
(53:13):
one. That's where our... That's where our listener lives. Anyways.
AI Heart Radio. AI Heart Radio. I think we...
Let's package that up. We'll get a Series A in no time.
So. But yeah, if you're listening and you haven't done that before,
(53:36):
we'd love it if you could. And we always just love hearing feedback
about the show as well. And it helps us think about the future
of the show. So really appreciate it if you can. And no show
would be complete without a huge shout out and a thank you to
our producer, Josh Crowhurst, for everything he's doing to make the show
possible. So thank you, Josh. And I am sure I speak for both
(53:57):
my co-hosts, Tim and Julie, when I say: no matter
what your device is and what you're measuring, just remember, keep analyzing.
Thanks for listening. Let's keep the conversation going with your comments,
suggestions and questions on Twitter @analyticshour, on the web at analyticshour.io,
(54:20):
our LinkedIn group and the Measure chat Slack group. Music for the podcast
by Josh Crowhurst. Smart guys wanted to fit in, so they made up
a term called analytics. Analytics don't work. Do the analytics say,
go for it no matter who's going for it? So if you and
I were on the field, the analytics say, go for it?
(54:40):
It's the stupidest, laziest, lamest thing I've ever heard for reasoning
in competition. Rock flag and keep analyzing me.