Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hey, this is the Fit Mess podcast.
My name is Jeremy.
Thanks so much for listening.
Here on the show, we talk about all kinds of things regarding your mental health, anxiety, depression, ways to live with them, ways to try to live with them a little bit less.
Today, we're going to venture out to something a little bit different.
You know, I consider myself something of a geek when it comes to the various technological ways to track the things we're doing so that we can try to make that little 1%
(00:21):
improvement we're all trying to make every day.
So at any given time, I've got a number of different devices strapped to different parts of my body to keep track of those things.
So, you know, your smartwatch tracks your steps, your ring tracks your sleep, your apps all analyze your diet, but who has access to all of that personal health data?
(00:44):
This is gonna be fun today because we're gonna talk with not only one of my oldest friends, but an incredibly smart cybersecurity expert.
His name is Jason Hayworth, and we're gonna uncover some of those hidden risks of your favorite health tracking devices.
"Anything that's free or low cost is because you're part of the product offering."
My hope is that you'll learn how insurance companies might use your fitness data against you, how companies are cozying up to politicians, and most importantly, how to protect
yourself without giving up the convenience of modern technology.
(01:05):
So we'll take a closer look at the truth about data collection in health apps, essential steps to protect your privacy, and why your fitness tracker might be sharing more than you
think, to help keep your health data secure.
So stick around, you're not gonna wanna miss this crucial conversation about protecting your privacy in an increasingly connected world.
My conversation with Jason Hayworth is coming right up.
(01:27):
What if your mornings didn't start with an annoying jolt, but with calm focus and energy?
If improving your sleep routine is a top priority for 2025, meet the Loftie Clock, a bedside essential engineered by sleep experts to transform both your bedtime and your
mornings.
Here's why I find the Loftie Clock to be a game changer.
(01:49):
It mimics your body's natural rhythm.
It wakes you up gently with a two-phase alarm:
first a soft wake-up sound to ease you into consciousness, followed by a more energizing get-up sound to help you start the day.
Loftie is not just an alarm clock.
It's your all-in-one bedside sound machine.
It has over a hundred options, including white noise, nature sounds, meditations, and even bedtime stories to help you relax and unwind.
So get that phone out of your bedroom.
Let's face it, grabbing your phone to set an alarm or listen to an audiobook often leads to an hour or more of mindless scrolling.
(02:16):
Join over 100,000 blissful sleepers who have upgraded their rest and mornings with Loftie.
Go to byloftie.com and use the code FITMESS20 for 20% off orders over $100.
That's b-y-l-o-f-t-i-e dot com, promo code
FITMESS20.
All right.
So Jason, you and I have known each other for a long time.
(02:37):
You're one of the smartest people I know.
So I'm going to do my best to keep up in this conversation, but I am currently wearing an Oura Ring and an Apple Watch.
I've got my Apollo band.
I use AI all day long for my work.
I love it.
I live by it.
It makes a lot of my decisions for me.
It does a lot of my work for me.
But this may not be the smartest, safest path to be going down.
(02:58):
Is that a fair assumption?
Well, I don't know about that.
It's a balancing act, right?
So, I mean, let me back up a little bit and say, first of all, if I'm one of the smartest people you know, you need to expand your sphere a little bit.
I'm just old and I've been doing this stuff for a long time.
We'll add up how many times I say "I don't get it" at the end of this conversation, then we'll decide.
(03:20):
I'm glad we're recording it.
Yeah, so all the devices that you're wearing.
So I mean, I think you mentioned four of them, but there's probably more of them around in your environment and your home and everything else you're not even aware of.
So I mean, you're really talking about biometric trackers.
So things that are looking at signals that your body's giving off, taking those, and sending them off to some ephemeral database out on the internet. And whether that ephemeral
(03:45):
database is actually secure, whether it's sharing your data, who they're sharing your data with, what the capacity for it is, and how they're enriching that with other information can all potentially be used as a way to
target you sometime in the future.
(04:06):
We give up information all the time, right?
So, I mean, we use social media networks, we use streaming media.
I mean, all these things require you to sign an end user license agreement.
Even something as simple as your email.
You're giving up tons of information.
I mean, anything that's free or low cost is that way because you're part of the product offering, and they are taking that information and the data and reselling it or using it to re-advertise
(04:30):
back to you.
And based upon the information that's there, they're creating demographic profiles about you, finding information about you and figuring out the things that you like to do.
Like, for example, if you're wearing an Apple watch or a Fitbit or a Garmin or a Polar orany of the thousand devices out there that can
look at the number of steps that you take, what your exercise patterns are, you know, do you actually go into your watch and click the little button that says, I want to walk
(04:53):
right now.
That informs those people as to what your commitment is, what your repetitive cycles are, and how you're going to move through those things.
And right around January, all these devices get like spikes, because everyone's going into the new year thinking they're going to create resolutions.
Right.
And there's a million different ways to look at these components.
So Apple did a really good job of going through and kind of
(05:15):
collecting all these different data sources and created Apple Health.
And they have their own watch, right?
The Apple Watch is a really great lifestyle product.
Whether or not the data it collects from a signaling perspective on your health is good,that's debatable.
I think over the last few years, it's definitely improved.
But what they really become is this data aggregation function.
(05:37):
So they can take all these different signals and put them together and give you a correlated data output.
And you can actually
share some of this data with your doctors or your medical professionals or your fitness coaches or your nutritionist.
Those things are fantastic, right?
And there's all kinds of other signal data that can come in down towards you.
(05:57):
So diabetics back in the day used to use continuous glucose monitors that would track the blood sugar level in your body at any given point in time.
It's basically been a miracle for type 1 diabetics.
One of the main problems for type 1 diabetics is that they take too much insulin
and don't get enough sugar in their body, and then they crash.
And when your blood sugar really gets below 50, actually really below 60, most people start feeling kind of woozy and out of it.
(06:23):
And type 1 diabetics have kind of trained themselves to go, you know, I kind of feel this way, I might want to take this.
But over time, you don't know if you're feeling that kind of woozy and that kind of tired because your blood sugar is low or just because you're tired or whatever.
So they will sometimes eat when they don't need it, or they will sometimes take insulin when they don't need it.
It's that combination of finding the right values
(06:44):
that have helped those people in that space actually get something of value out of these types of wearable devices.
If you couple that with the alerting functions that tell you what your glucose is, so you know when to take insulin and when to eat food, that's great for type 1 diabetics.
Now let's go to type 2, or like where most Americans are, pre-diabetic.
(07:05):
Yeah, I mean, we eat too much, we drink too much, we don't move enough, we don't sleep enough, we're way stressed out.
So there's a lot of companies out there that created these fitness programs that were wrapped around the idea of taking these wearable devices and looking at measurable ways to
find output to help people improve their health outcomes.
(07:26):
And what they would do is they'd slap these CGMs by Dexcom or Libre or somebody else on them, and they would resell them back as a service.
So unless your insurance company is paying for this, you're probably paying 35, 40 bucks a month for these little sensors that get dropped in your arm or your hip or your...
Are these like the InsideTracker or whatever?
Like, it's literally a disc that you stick into your arm, and it's tracking your blood sugar all day long and reacting to whatever you eat.
(07:49):
Yeah.
So it tracks that, and the idea is that you're supposed to put in what you're eating and what your activities are into some of these apps so that you can track it. But for a long
time, these apps didn't have anything in there.
Well, take all these different wearable functions that you have.
So I mean, you're wearing an Oura, and the Oura is actually fantastic at doing passive SpO2 blood oxygen monitoring, and it does something that a lot of other wearables don't.
(08:14):
It measures your body temperature.
Which was great during COVID, because it was a really good indication as to whether or not you were potentially getting sick. And they could look at your heart rate, and if
your temperature went up and your heart rate went up at the same time, it's pretty likely that you have an infection.
I own one. I don't wear one anymore, because I work out and I had to take it off all the time. I broke one one time, so I felt pretty bad about breaking this $300 wearable. But I was
(08:41):
taking that in.
I was wearing my Garmin, I was wearing my Fitbit, and I was wearing another one called the Weeby something that supposedly thought it would track hydration levels.
So yeah, I had one watch on one arm, two on the other, plus the Oura Ring. And I have sleep apnea, so I use a CPAP.
(09:06):
So I have that signal coming inbound towards me, and I have no way to collect and correlate this data.
Yeah, yeah.
And Apple Health does about half of those, pulling those things in, but it doesn't give you the whole scope of what's there.
It just takes the summary data that the application itself picks up from those devices.
If you connect to the APIs on the back end (API stands for Application Programming Interface), you can actually gather a ton of information.
(09:33):
So Garmin devices have like 30 or 40 different signals inside of them that you can pull from the API.
and you can start looking at them over time as individual signals.
But when you open up the Garmin app, you don't see all those signals.
You have no way to see all those signals.
What you see is a summary of that information and the output that's shown up.
And that's going to be based upon whoever the algorithm experts are at Garmin giving you that information, who may or may not have medical degrees or training.
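To make that summary-versus-raw-signal distinction concrete, here's a rough Python sketch. The endpoint path, field names, and token handling are hypothetical (the real Garmin and Fitbit APIs use OAuth and their own schemas), but the shape of the idea is the same: the API exposes per-sample data, while the vendor app shows you a single aggregated number.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint and field names, for illustration only.
def fetch_signal(base_url, token, signal, start, end):
    """Pull raw per-sample data for one signal (e.g. heart rate) from a wearable API."""
    req = Request(
        f"{base_url}/v1/signals/{signal}?start={start}&end={end}",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urlopen(req) as resp:
        # e.g. [{"ts": 1700000000, "value": 62}, {"ts": 1700000060, "value": 64}, ...]
        return json.load(resp)["samples"]

def daily_summary(samples):
    """The vendor app typically collapses those samples into one number like this."""
    return sum(s["value"] for s in samples) / len(samples)
```

The point Jason is making is that `daily_summary` throws away the thirty-odd individual signals you could otherwise study over time.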
(10:01):
That's what I wanted to ask you about, because you even mentioned that Apple has sort of come a long way in terms of their tracking and that information.
To some degree, I want to know how much I can trust that what I'm getting back is accurate.
I want to know. I mean, I'm sure I could use the information better.
I mean, just breaking it down really simply.
Initially, I thought the Oura was lacking in terms of its ability to track steps, because it was always wildly different than whatever device I was using.
(10:28):
But then, is the other device really any more accurate?
Like I don't know.
I'm not going to literally count every step I take to match up at the end of the day tohow close they are.
So for me, I landed on this is the one I'm going to use.
And it's basically going to be my tracker.
Like if it said I did 10,000 steps today, then tomorrow I'm going to do 10,001 on this device, not on another device, not on something else.
(10:50):
Like this is my baseline.
This is what I'm going to use and whether it's accurate or not, doesn't matter.
It's going to show me the variation in my activity
based on the current measurement, whether accurate or not.
Is that sort of how people should use these things?
Absolutely.
So back in the olden times, the concept of a foot was not 12 inches.
(11:13):
It was someone's foot.
And if that foot was Ted's foot or Bill's foot, and they're different sized feet, you're going to wind up with different sized walls.
And if one person's building one wall on one side and another's building one on the other, you might wind up with some very uneven architecture.
Right.
The same thing goes for trackables and wearable devices.
So how they measure them is typically based on some type of accelerometer function.
(11:37):
Imagine you've got a little sphere with another little sphere inside of it, and it kind of rolls around.
When it bounces off the edges, that's when it goes, this person has made a transition.
I think this might be a step.
And it's a guessing game.
Because I can sit there and do this, and it counts as steps.
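A minimal sketch of that guessing game, assuming a simple threshold on acceleration magnitude. Real trackers layer filtering and cadence models on top of something like this, which is part of why two devices rarely agree, and why clapping can register as steps.

```python
# Count a "step" each time the acceleration magnitude (m/s^2) crosses a
# threshold on the way up. ~9.8 is gravity at rest; peaks above it are motion.
def count_steps(magnitudes, threshold=11.0):
    steps = 0
    above = False  # tracks whether we're currently above the threshold
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1      # rising edge: one new "step"
            above = True
        elif m <= threshold:
            above = False   # fell back below; ready for the next peak
    return steps

walking = [9.8, 12.1, 9.5, 12.4, 9.7, 12.0, 9.8]   # three clear peaks
clapping = [9.8, 12.5, 9.8, 12.5, 9.8]             # hand motion looks identical
```

With data this naive, `count_steps(walking)` finds three steps, and `count_steps(clapping)` happily counts the claps too.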
Yeah, I was gonna say, a friend of mine went to a performance once.
(12:00):
She hadn't walked all day but clapped all night, and she had like 3,000 steps because she was just clapping at the performance.
Yeah
right.
I mean, there are actual devices out there, pedometers, that you can put on your foot.
You can wear on your ankle.
So I've actually done this before, where I had to go and buy a bigger band.
I have ridiculously tiny ankles for a big man.
(12:21):
So an extra-large Garmin or Fitbit band fit just fine.
And I could actually walk one path, check it.
And I'd carry my iPhone with me, which also had a pedometer on it.
And I would look at that as it went, and I could compare them side by side, and I could definitely see a difference between when I would wear it on my ankle versus when I'd keep
(12:41):
something in my pocket versus when I'd have something on my wrist.
And I mean, the difference wasn't massive, like it was within five, six, seven percent.
It wasn't huge.
But that's, you know, well worth looking at, because if you're looking at these things over time, seven percent adds up. Seven percent
(13:01):
body fat is much different than 12% body fat or 15 or 25.
Yeah.
So these things are tracking all this data, and, you know, on paper, it's to help us improve. Help me know how to be 1% better.
What are the numbers of steps I need to take tomorrow to beat today?
What do I need to eat tomorrow to beat today, et cetera, et cetera.
(13:23):
On, you know, again, on paper, on the surface, this is great.
It's helping me.
It's either something I can afford, because I bought it, or, you know, it fits my lifestyle.
And if they are collecting data, I would hope that it's to give me a better product, right?
They're going to improve the thing.
They're going to add to it.
But I imagine there's some nefarious things going on as well.
So again, like we said, most of the time, if it's cheap or low cost, you're the product.
(13:46):
And even with these devices, you're still part of the product.
So they're going to use your data to make their product better and to figure out ways to create things that are better, to market materials to you in a better fashion, to try to
extract more value out of you.
You can look at Fitbit, and you can look at Garmin and Polar, for example.
(14:06):
If you want the good version of their app, the one that actually gives you more information and exercise programs and everything else?
Great, you pay seven bucks a month, and then you get access to all these other tiers.
And they're going to go through and send you targeted messages to try to entice you to actually buy into these service layers.
But there's another angle to this, and it's that your data might be being resold somewhere else, or it might be being scraped by somebody else.
(14:35):
Some of these AI inference models, you know, ChatGPT, Claude, DeepSeek, have
people out there going through and gathering this information from publicly available sources.
And theoretically, your data should be locked in and secured in those sites.
Most of them require multi-factor authentication, which is not easy to get past or break through.
(14:56):
But part of those EULAs also say that your data sometimes gets anonymized.
But that anonymization of data can change over time.
And they're all based upon the idea that they're going to obscure what they call PII, which is...
your personally identifiable information.
And typically speaking, that comes down to a telephone number or an email address or a physical address. But even an IP address, like what the IP address is on your computer,
(15:22):
that can be considered PII, depending on whether you're looking at GDPR or California's version of it.
When you start looking at these things in context, and you start taking that information and kind of overlaying it over the top of each other, and then feeding it into an aggregator
like Apple or Google or Samsung, each of which has its own health app,
you start exposing yourself to multiple different layers and areas in which your information can be extracted and pulled in.
(15:48):
And if you look at things like the current administration, who seems to play a little loose with the rules when it comes to protecting people's information, and really
information sovereignty, you're going to see a lot more things tend to pop up and potentially be used against people.
(16:09):
I'm glad you went there, because this is something that I've heard a number of women talking about particularly, because a lot of these things will track their menstrual cycle.
And they're like, fuck this Handmaid's Tale bullshit.
I'm not reporting a goddamn thing to a goddamn person, because it just takes the wrong person with the stroke of a pen deciding that that is now the government's business and
not mine.
(16:29):
Is that realistic or is that okay?
How realistic is that?
How concerned should people be about that?
Very.
I mean, honestly, they should be.
So, I mean, it depends, if you're worried about pre-existing conditions and you're talking about US medical healthcare, or the US medical system just in general.
I mean, it's fucked and it's stupid anyways.
And we pay three times the cost for the same or worse healthcare outcomes than a lot of other countries, in what you would consider to be the first-world nations.
(16:59):
And we complain about insurance companies, and yes, they have overhead.
They're not making massive profit margins, but they're still adding 25, 30%.
But that doesn't account for 200 % increased cost.
And that comes down to the fact that we've got a for-profit medical system in this country, and it's not designed to provide care as its primary function.
(17:19):
Its primary function is to make money.
And again, if it's free or low cost,
you're the product.
Hence insurance, and why these things get pumped through it. You won't end up paying a lot unless you don't have insurance.
So there's a lot of pieces that go into this and all these companies are going to try touse as much information as possible to deny having to pay your claim, whether you're a man
(17:41):
or a woman or anything else.
And we talk about women having their menstrual cycles tracked.
Well, there's other ways to infer whether or not somebody is feeling sick or has other conditions, or the things that they're doing,
based upon their movements.
And if you look at things based upon movements, and you look at things based upon thestuff that they would normally do, and you can create tracks and trends, you can create an
(18:01):
inference model that shows how these things actually work.
And by correlating all that, it's not enough to just not let it see certain bits of information, because they can infer things from the other ancillary information that is
available in the regular operation of the device that you're wearing.
(18:21):
So these insurance companies could theoretically use a tool to gather information about how often you were sick last year, whether you went to a doctor and a card was swiped or
not.
But they can tell, based on this biometric data, that you got sick X amount of times.
So you are a higher risk.
So your premiums are now increased by 20 % or whatever.
And imagine that for things like life insurance and automobile insurance.
(18:43):
I mean, everything that has a cost-center basis, where you may or may not be subject to interpretation of risk, is going to be affected by all these elements.
Connected cars have the same aspects, right?
Like, I think Allstate has a thing called Drivewise, where if you put it on your phone, it tracks how fast you drive
(19:06):
in different conditions, you know, are you breaking the speed limit?
Are you moving these things across?
But the idea is that if Drivewise is on and you're doing well, they're going to give you a discount.
if the inverse is true, so if you're a reckless driver, they're going to use that againstyou.
And if you get in a car accident because you're driving recklessly and they see this thing go through, your insurance company might actually go, well, you're responsible for this.
(19:29):
So we're going to use that to deny these claims.
Any bit of information that you hand over and give to folks in this fashion is going to be litigated in court, and how they pay things out is going to be used as evidence against
you.
And don't fool yourself that the companies aren't looking at ways to make this happen.
There's just certain legal reasons today that they can't, but that will change.
(19:51):
Yes, I think this shines a big bright light. Not to turn this into politics, because you can get this shit on CNN or whatever, but when you look at the inauguration pictures
and there's Jeff Bezos, Mark Zuckerberg, Elon Musk, like, these tech bros know that if they align themselves with those in power, it's going to grease the wheels heavily to give them
more access to this information, to not only sell us more products, but to actually manipulate
(20:16):
many aspects of our lives and the things we pay for and the amount we pay for them.
Yeah, I mean, the thing that prevents people from running amok with your data islegislation.
It's not legal to do it.
Well, if that changes, it goes in their favor.
I mean, you can look at the H1B visa thing that they're pushing for right now, and you've got Trump supporters getting riled up about this and being angry that H1B visas are being
(20:42):
approved by these tech bros, but these tech bros bought their way.
into the Trump administration and they're going to stay there.
Like, they're not going to drop out.
I mean, they've made an investment, and I'm not going to debate their politics, because you don't have to.
This is all business motivation and economically motivated.
And you can call it greed.
You can call it whatever you want, but they're not strictly doing it for the social good, because if they were doing it for the social good, we'd probably have turned off social
(21:12):
networks a long time ago.
Yeah, absolutely.
Okay, so what do we do to protect ourselves?
Because I mean, even as I'm sitting here with all these devices, I know that when I set them up, I set up various privacies to do what I wanted them to do.
But I also know that I have them talking to each other.
My watch talks to my personal trainer app, which talks to Apple Health, which talks to whatever.
(21:34):
So I mean, while on each level, I might have some safeguard in place for what I don't want shared, I am sharing it with other things that are maybe working around some of those
rules.
How do I protect my information as much as possible?
Yeah, so the best way to do it is to completely and totally disconnect and go live in a cabin in the woods.
(21:54):
Right, throw these things in the river and go live in the mountains, right?
and disconnect from society, stop using credit cards.
So the practical reality is that some of this information is going to be collected.
But what you should get in the habit of doing is sanitizing the information feeds and sanitizing the information that you have from a historical perspective.
(22:15):
So pretty much every company out there has a privacy and security department that can go through and
eliminate your data from their system. Now, it won't get rid of the anonymized data and how those things are put in, but the PII can be yanked out of the system and destroyed. So if
you decide, I'm not going to use this device anymore, make sure that you follow the instructions to ensure that all of your data is destroyed, and email their support team
(22:41):
saying, I would like to make sure that all of my data is taken out. Many states have legislation around this, and they have to kind of push in that direction, but most of these
companies kind of, you know, don't want to get into
a fight that's going to be related to something like GDPR or the California Consumer Privacy Act.
Those are things that you definitely want to do from a day-to-day basis.
(23:05):
However, make sure you have multi-factor authentication set up on every single one of your accounts.
So if you don't know what that is, just type in MFA anywhere and you'll see it pop up.
But basically what it does is you log in with your username and password and then theysend you a second authentication method.
So that could be an app on your phone.
They could send you a text message.
They could send you an email.
(23:25):
It doesn't matter which one it is.
Use it and use it in everything.
And if you can, rotate those things through.
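For the curious, the codes those authenticator apps generate aren't magic; most implement TOTP from RFC 6238, which is just an HMAC over the current 30-second time window. A minimal sketch using only Python's standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, digits=6, interval=30, now=None):
    """Generate a time-based one-time password per RFC 6238 (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Counter = number of 30-second intervals since the Unix epoch.
    counter = int((time.time() if now is None else now) // interval)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # "dynamic truncation" from RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server and your phone share the base32 secret (that's what the QR code encodes), so both sides can compute the same six digits independently.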
Next, set a strong password on everything.
So don't reuse the same password over and over and over again.
You want to try to use different passwords for different accounts, because you are your persona.
(23:47):
Your identity online is...
going to be based upon either handles that you have on different accounts or email addresses.
And that's typically how companies will track you or telephone numbers.
But the idea is that these collections of things equal you.
And I used to work in ad tech for a long time.
And we would see anonymized folks, people that looked like they were anonymized, coming through with no cookie data or anything like this.
(24:09):
But because we had a record of their IP addresses and we had a record of things that they did before, we could actually build these anonymized profiles.
And then if we got an email address, we'd start stuffing these email addresses in: possibly this, possibly this, possibly this.
And this is before AI inference and neural networks existed.
This is all what you would call expert systems in the manual stitching together phase.
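A toy sketch of that manual stitching, with made-up event data: anonymous page views that share an IP address get grouped into one profile, and a single later event carrying an email address links the whole browsing history to a person. The IPs and email here are illustrative, not real tracking data.

```python
from collections import defaultdict

def stitch_profiles(events):
    """Group anonymous events by IP; any email seen later tags the whole history."""
    by_ip = defaultdict(lambda: {"emails": set(), "pages": []})
    for ev in events:
        profile = by_ip[ev["ip"]]
        profile["pages"].append(ev["page"])
        if ev.get("email"):
            # One identified event de-anonymizes every prior event from this IP.
            profile["emails"].add(ev["email"])
    return dict(by_ip)

events = [
    {"ip": "198.51.100.7", "page": "/running-shoes"},
    {"ip": "198.51.100.7", "page": "/marathon-plans"},
    {"ip": "198.51.100.7", "page": "/checkout", "email": "jane@example.com"},
]
profiles = stitch_profiles(events)
```

This is why "anonymized" data is such a soft guarantee: the linking key doesn't have to be present in every record, just in one.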
(24:31):
They're a lot better now.
You don't have to do all that.
It just happens automatically.
So make sure that you are using good password protection and
good password policies, that you're enforcing yourself to use MFA, and that you delete as much of your bad data as possible.
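On the unique-passwords point, generating a strong random password is the easy part (a password manager does this for you). A sketch using Python's `secrets` module, which is designed for security-sensitive randomness, unlike the plain `random` module:

```python
import secrets
import string

def generate_password(length=20):
    """Generate a random password containing at least one lowercase letter,
    one uppercase letter, and one digit."""
    alphabet = string.ascii_letters + string.digits + "!@#$%^&*-_"
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Resample until the required character classes are all present.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)):
            return pw
```

A different 20-character password per account means one breached site doesn't unlock the rest of your persona.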
(24:55):
So even if I do all that, I feel like those steps are going to keep the bad guys out, right?
Like the people that are out there looking for things that they shouldn't have their hands on.
I mean, I guess the answer is going to be very subjective.
But how do you trust the company, the people on the other side of this? How do we know who to trust?
(25:16):
Yeah.
Yeah, you don't trust anybody.
So, I mean, there's a really good case.
There's a couple of good cases where Apple's gone through and said, we're not going toprovide access to iPhones for certain people, for certain terrorist groups.
And they block people out and they push back and push back and push back.
And other security companies came in and figured out how to crack the iPhone's security, which Apple claimed was uncrackable.
(25:40):
So we know that security is an illusion and bullshit.
So don't expect these companies to continue, over a protracted period of time, to protect your data and your interests.
Just assume that these things are going to come out.
And honestly, if you really want to do the best thing for you: stay in good shape, delete your data, keep good password hygiene, make sure that you're not leaving things out
(26:07):
there that make it easy to find you, and use the privacy settings in your social media content.
Right?
That's a big thing.
I mean, people don't realize that all of their posts go out there on the public internet, and they're searchable via Google. And all these AI systems are supposed to respect this thing
called robots.txt on every website out there,
which says what you can and can't get. But nobody respects anything.
(26:29):
Facebook's AI crawler flat out tells you it's not going to respect anything.
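You can see how purely advisory robots.txt is with Python's standard-library parser. The rules below are for a hypothetical site; nothing technically enforces them on a crawler that simply chooses not to check:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules: everyone is asked to skip /private/,
# and one named bot is asked to stay out entirely.
rules = """
User-agent: *
Disallow: /private/

User-agent: GPTBot
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# A polite crawler calls can_fetch() before every request:
rp.can_fetch("GPTBot", "https://example.com/post.html")        # blocked entirely
rp.can_fetch("SomeOtherBot", "https://example.com/post.html")  # allowed
rp.can_fetch("SomeOtherBot", "https://example.com/private/x")  # disallowed path
```

The check happens entirely on the crawler's side, which is exactly Jason's point: a crawler that declares it won't respect robots.txt loses nothing by ignoring it.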
So, you know, if you have these bits of information, or if you have a website or a blog post or something else, use proper controls.
Put a good web application firewall in front of it.
Do something good as a small business to try to make sure that your users are as protected as possible.
(26:50):
And I'm not saying that data breaches and data leaks aren't going to occur.
They're definitely going to occur.
But it's something that you're just going to have to be aware of.
I would also highly recommend spending the $50, $60, $70 a month for an identity protection service.
They all have some type of value around them.
The likelihood that you're
going to have your stuff hacked, and someone's going to take your Social Security number or your banking information in your lifetime, is high.
(27:16):
I am a cybersecurity pro.
Like I was chief product officer for a company that made cybersecurity products.
And I've had my credit card stolen twice.
I've had my Social Security number bounce around the dark web a dozen times.
And I have all these control systems and all these things in place, and I use good practices for sanitization and multi-factor authentication, and it still happens.
So buy insurance, and put
(27:39):
effort toward ensuring that when this thing does happen, you have recourse and a way to try to protect yourself.
So is there anything we didn't talk about that you want to talk about?
I'm sure.
Yeah, yeah, yeah.
Yeah, the next topic it'd be fun to go into is actually how I was able to go through and take some of this wearable data.
I wasn't good at getting information from Garmin or Fitbit or anybody else.
(28:03):
So a couple of friends of mine, we built an application called UpRove.
And you can find that at he.uprove.me.
It's basically a data aggregator that takes data from all these different sites and all these different signal angles.
And it also allows you to share with other folks.
It collects and correlates this data in unique and different ways to show things over time.
And I used it to drop a third of my body weight in four months by going through and tracking and looking at how these things went through.
(28:31):
And there's a long story around it.
But maybe next time we jump into that.
Well, I was right.
That was fun.
My thanks to Jason Hayworth for joining me here.
If you'd like to learn more about his cybersecurity consulting work, you can check out HayworthConsulting.com.
The link for that is in the show description for this episode.
You can also check out UpRove, which he mentioned there at the end of the interview.
The link to that is also in the show description.
Stick around for a minute.
(28:51):
Thanks for hanging out with me today and listening to my conversation with my old friend Jason.
That is going to do it for this episode.
I've got to go.
I've got to set up a whole bunch of two-factor authentications that I have not set up, and now I'm terrified.
So I'm going to go do that.
You do that as well, but come back next week to thefitmess.com.
That's where we'll have another new episode for you.
And before you go, if you do know of anybody who could benefit from hearing this conversation that you just listened to, please do share it with them.
(29:14):
You can find the links to do that at our website, thefitmess.com.
We will see you in about a week.
Thanks so much for listening.