Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (00:06):
And usually it's Tech Thursday, but Marsha Collier has the
night off, she is out of town. But we're still
going to discuss some of these majorly important tech issues.
And if you've been listening to Later with Mo Kelly
for any amount of time, we have been harping on
the idea that your data is the most important thing.
(00:28):
It is more valuable than your Social Security number, it is
more valuable than just about anything else that you may have,
because it's the key to everything. It is what makes
the tech world go round. Your data. You know this
because you've been listening to Later with Mo Kelly. So
I got a real chuckle when I was checking out
(00:50):
this article in the New York Times supposedly, you know,
like the news leader. It's like you're kind of behind
on this issue. The FTC, the Federal Trade Commission, said
on Thursday, as in Today, that it found that several
social media and streaming services have been engaging in a
vast surveillance of consumers, including minors, collecting and sharing more
(01:15):
personal information than most users realize. That should surprise absolutely
nobody who listens to this show, because we have been
pounding that idea again and again and again. Marsha Collier
has talked time and time again about protecting your data.
Your data is what they're looking for. Your data is
what they're trying to steal. Your data is what they
(01:37):
want in exchange for something else. You think you're getting something,
but they're getting your data in return. She always talked
about and I know you can hear in my ear
right now, she always talked about reading the terms of
service all the way down to the end to figure
out what they're getting in exchange for your access to
their website or their service. The findings come from a
(02:01):
study of how nine companies, including Meta, which is Facebook
and Instagram, YouTube, and TikTok, collected and used consumer data,
as in your data. Because we're all on Facebook or Instagram,
we all use YouTube. A lot of folks, maybe the
younger generation more so than the older generation, will use TikTok.
(02:21):
This applies to everybody listening right now. Everybody. These sites,
they offer free services, meaning you can use Facebook, you
can use Instagram. You're not paying to use it unless
you want to get like a check mark. I think
Facebook you can pay to get a check mark now,
(02:43):
but you can use these sites without paying money. So
on the surface they're free. But these services profit off
the data you feed into them by directing advertising that
targets you. You're separated by demographics and other portions of
(03:04):
your personality, you know, whether demographics, ethnographics, all those kinds
of things. The companies also, something we talked about, you know,
but now it's a big deal because it's in the New
York Times today. The companies also failed to protect users,
especially children and teens. Instagram, what do we talk about
(03:24):
just days ago? Instagram is now implementing these teen accounts
because they knew that this backlash was going to come.
This study is almost four years in the making, and
according to the FTC, it's the first holistic look into
this business practice of a lot.
Speaker 3 (03:43):
Of these online platforms.
Speaker 2 (03:46):
And these are multi billion dollar multinational corporations. This is big,
big business, and the FTC said the report showed the
need for federal privacy legislation and restrictions on how companies
collect and use data. We have been talking about this
ad nauseam for years, about protecting your data as best
(04:08):
you can, because that's what the companies want and have
you noticed and it just Oh, I got so mad.
I think I noticed it last week. If you use Facebook,
you'll check your notifications, right, okay, see who's responded to this,
or see who's posted that. Have you noticed that they're
putting ads in your notifications, like you're getting notified and
(04:32):
they're promoting a product in your notification. There's absolutely no
way that you can escape any of these ads. But
that goes back to the sites that you'll visit, the
things that you talk about. Your phone is listening to
the conversations that you're engaging in. All that's the data
that you're putting into this algorithm which spits back out ads,
(04:54):
spits back out products for you to buy, things to
look at, news to read, people to connect with. It's
almost like you get the suggestions of people, people
you might know. Oh, let me just figure that out
on my own. I don't need suggestions. I don't need
suggestions on people to become friends with. Why because they
(05:16):
want the increased user engagement, the deeper connection. How
they are getting crucial data about you, your name, birthday,
where you were born, every bit of metadata they
can get on you, is by way of all the
(05:37):
new sub-apps and intra-apps, the games. If you
see people doing those cartoon caricatures of themselves, the AI
images of themselves, all of those things that you opt into.
Speaker 4 (05:53):
Everything that you opt into on any one of these
social media sites that seems fun, you know, you answer
a trivia question or things like that. That's where they're
getting a lot of this data that they are selling
bar none.
Speaker 2 (06:06):
Not only that, in addition to that, Twala, and that's
a great point, but not only that. Whenever you use
a third-party login, so let me log in
via my Facebook account, let me log in via my
Google account, they have to share that information, and with
it all the data associated with your account. And it
may be more convenient for you to do it that way.
I know I've done it. It's not smart, but it's
(06:29):
easier to log in. Yes, log in via Google, log in
via Facebook. But that data also goes out the window,
and you've shared it again with someone. You think you're
going there for one reason, but you're also giving them
things that you didn't explicitly agree to or aren't consciously aware
that you're giving them. This is going to require more conversations,
so we're going to hit this again on the other
(06:50):
side of this break. It's Later with Mo Kelly, as
we talk about data collection, data dissemination, and how everything
that you hold dear, everything that you know about yourself
or have even forgotten about yourself, it's out there for
these online platforms to use and abuse.
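The third-party login flow described above is usually built on OAuth. As a rough, hypothetical sketch (the provider endpoint, client ID, and site names below are made up for illustration, not taken from any real service), the privacy-relevant detail is the scope parameter, which lists exactly which pieces of your account data the site is asking the identity provider to hand over:

```python
from urllib.parse import urlencode

def build_login_url(auth_endpoint, client_id, redirect_uri, scopes):
    """Build the authorization URL a 'Log in with X' button sends you to."""
    params = {
        "response_type": "code",       # standard OAuth authorization-code flow
        "client_id": client_id,        # identifies the third-party site
        "redirect_uri": redirect_uri,  # where you land after approving
        "scope": " ".join(scopes),     # the account data being requested
    }
    return auth_endpoint + "?" + urlencode(params)

# Hypothetical example: a news site asking an identity provider for your
# identity, email address, and profile info.
url = build_login_url(
    "https://accounts.example.com/authorize",
    "news-site-123",
    "https://news.example.com/callback",
    ["openid", "email", "profile"],
)
print(url)
```

Every extra scope that shows up on a consent screen, whether a friends list, a birthday, or contacts, is another slice of account data shared between the two companies, which is the trade-off being described here.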
Speaker 1 (07:07):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (07:14):
KFI AM six forty, it's Later with Mo Kelly.
We're live everywhere on the iHeartRadio app. In the last segment,
we began a conversation about how data is king and
how it's the most valuable currency and commodity out there
for these big corporations. And they get your data through
(07:37):
your online habits, of course, the websites that you visit,
but they can get more in one fell swoop from
your social media platforms. The things that you do on Facebook,
your presence on Instagram, YouTube, TikTok, not just your information
on your profile, but your viewing habits, who you're friends with,
(08:01):
the games that you play, like on Facebook. You know,
you play this game, or you'll put in a picture
and make it make you older or make you younger
or turn you into a dog or something like that.
Speaker 3 (08:13):
That's all a.
Speaker 2 (08:13):
Part of the data mining and in the years previously,
going back to the story about the FTC study about
how these online platforms are using and misusing and abusing
your data and selling it. In the past, the FTC tried
self regulation and left it up to the different online
(08:35):
platforms to self regulate, and the FTC concluded in its
recent report that self-regulation has been a failure. Duh.
Don't ever expect corporate America to self-regulate, or don't
expect corporate America to do the right thing as
it should, because it's not wired that way. It's not
meant to work that way. To give you an example: YouTube,
(08:58):
which is owned by Google, TikTok, Meta, which owns Instagram,
WhatsApp, Messenger, and Facebook. They are all part of
this and the FTC requested data from each company for
operations between twenty nineteen and twenty twenty and then studied
how the companies had collected, used, and retained that data.
(09:22):
Included in the study was the streaming platform Twitch, which
is now owned by Amazon. If you're not familiar with Twitch,
you can get on Twitch and you'll see people either
playing music in a video format where you can watch
a DJ play music, or you can watch someone who's
a gamer playing actual different video games. It's really strange
how people will actually spend hours watching someone else playing
(09:45):
a video game.
Speaker 3 (09:46):
Really strange, but people do it.
Speaker 2 (09:49):
There's also Snapchat and Reddit, and also X slash Twitter
that was included in the study. The study did not
disclose company by company findings, but companies have argued that
they've tightened their data collection policy since the studies were conducted.
I don't know if that's true, but you did know
that we talked about Instagram. They have these teen accounts
(10:12):
that they rolled out, and Facebook has been saying that
they're going to have some sort of age modification or
verification in their online platform, but ultimately they're still getting
data now. Just because they're trying to make it a
little bit more difficult for minors to use the platform
or for adults to reach minors on these platforms, they're
(10:37):
still collecting the data. They're still selling the data. I
talked about how now if you're using Facebook you'll get
ads in your notifications.
Speaker 3 (10:45):
They're just mainlining them straight to you.
Speaker 2 (10:47):
It used to be where it would be in your feed,
if you would scroll, you would come across an ad.
If you were on Instagram, you'd scroll your feed and
then you'd come across an ad, maybe every four or
five items. But now they're in your notifications. You cannot
in any way get around them or get away from them.
But it's all about your viewing habits, the types of
(11:11):
sites that you will connect to.
Speaker 3 (11:12):
Like, for example, if you were.
Speaker 2 (11:14):
To I'm trying to think, let's say you want to
sign into a third party site. Let's say you want
to sign in to the New York Times or use
them as an example, you can sign into the New
York Times just using your Google account. Not a good idea,
But at the same time, then Google knows you use
(11:37):
the New York Times, and all that data that you
have with Google is also shared with the New York Times,
and lo and behold, New York Times articles will start
popping up anywhere and everywhere where you get notifications. They
could start popping up on Facebook because that information is shared.
And so this, basically, is unavoidable. The dangerous part
of it is, and I've said this many times, if
(12:01):
you think that your data isn't out there, you are
deluding yourself. It used to be people were more concerned
about their Social Security number. No, it's not the Social
Security number, it's just your data in totality. You know,
if I'm an online platform, I know everything. I know
everything there is to know about Mark Ronner. I know
everything there is to know about Stefan just from his
(12:24):
online habits. I mean, like, oh hell, I mean, have
you seen the story today about the lieutenant governor of
North Carolina and his online habits which have come back
to bite him in the ass?
Speaker 3 (12:36):
Oh yeah, no, pun intended.
Speaker 2 (12:38):
But I'm saying that's a perfect example of how that
data and if you don't know the story Mark Robinson,
who is a Republican nominee for governor in North Carolina,
there's a controversy which is just now developing, where
old messages that he left on a pornographic website and
(12:58):
for some reason, he signed up for the website using
his real.
Speaker 3 (13:02):
Name, full name, and an email.
Speaker 2 (13:05):
Address which he's used for all these other websites for decades.
Speaker 3 (13:10):
So the website's name, by the way, Nude Africa.
Speaker 2 (13:13):
Yes, I mean, this is like a perfect example because
that data is out there. It was used and, depending
on your political persuasion, you can say it was misused,
to reverse engineer his online history, and now it is
out for the public to see. If you don't know
(13:35):
the story, Mark Robinson is denying the messages. He is
going to stay in the race. But I don't want
to get too sidetracked with that. But I'm saying that's
a perfect example of your online habits and that data
is of great value to corporations, even opposition research. It
wasn't an October surprise, but damn near close. It's everywhere
(13:58):
now too.
Speaker 5 (13:59):
If you open any social media or the news,
you're going to see this, and the details are, I
guess you could call them salacious, but that kind of
undersells it.
Speaker 2 (14:08):
You know, it really undersells it. You know, five
years ago, I'll say ten years ago, it
would be a career-ending story. Today it's like, uh,
maybe two news cycles. Maybe. Maybe he's not going to
drop out of the race. But the point is his
(14:29):
online history will never go away, will never go away.
The difference is he was going for a high profile
office and people could find that history and use it
against him. But all of your data has value for
various and sundry reasons. But all you can do is
remember that your data has value, and your data can
(14:52):
be used against you at any time, So always be
cognizant of that as you, you know, do your stuff
online and look at your different Nude Africa websites.
Speaker 5 (15:03):
All I'll say is that I admire his apparent ability to multitask.
Speaker 2 (15:07):
Look when I saw that story and I saw that
he used his full name, his full name, and this
was before he, I guess, had a career in politics,
used his full name and the email address for all
of his stuff dating back decades. We all have that
(15:29):
one email address we've had forever.
Speaker 3 (15:30):
Right.
Speaker 5 (15:31):
Oh yeah, it's bright, real bright.
Speaker 2 (15:36):
Look it's not my problem, but it highlights the point
of how important it is to protect all our data.
Speaker 3 (15:43):
It's Later with Mo Kelly, KFI AM six forty.
Speaker 2 (15:45):
We're live everywhere on the iHeartRadio app and when we
come back, we'll tell you about the ten companies with
the happiest employees. And let's just say that someone in
the studio has worked for one of them.
Speaker 3 (16:00):
I wonder who that is.
Speaker 1 (16:01):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (16:07):
And we all, we all, assuming you're still in the
workforce, we all want to work somewhere where we
feel valued, where we're happy, or at least content,
when we get out of bed in the morning
or, in the case of Mark Ronner, in the late afternoon, okay,
and decide to go to work. We want to know
that where we're going is worth it all. And I
(16:28):
always wondered where's the best place to work? I know
that's subjective, but you know, I wonder where is the
company that everyone or a large majority of employees say, Hey,
this is the place to work, we are happy to
be here. Well, according to this list, and it's from
(16:48):
Fast Company, it lists the top one hundred large companies
and those are companies with more than five hundred employees,
as far as the happiest employees. Of course, being happy
is subjective, but you could take it for what it's worth.
And there is a company on this list that someone
(17:11):
in the Later crew, or me, works for, so
we will be able to get some firsthand insider perspective.
Coming in at number ten as the happiest place to work:
Speaker 5 (17:31):
Salesforce. I guess, is that an actual business name
or is that just a generic, sales force? No, no, no,
it is a company. I couldn't tell you exactly what
they do. Do they drink out of cans that say
beverage on them?
Speaker 2 (17:45):
No, no, that's the company. In fact, when I was playing
that Gavin Newsom audio earlier, he was sitting down with
the CEO of Salesforce. Okay, yeah.
Speaker 3 (17:56):
Number nine, Topgolf.
Speaker 2 (18:04):
Now, I don't know what it would be like to work there,
but it's fun to hang out there and swing the sticks.
Speaker 3 (18:10):
I enjoyed that, and I'm a horrible golfer.
Speaker 2 (18:13):
But if you're just like at a driving range and you
have like six drinks in you, sure, why not?
Speaker 3 (18:19):
Yeah?
Speaker 2 (18:20):
Put me down for the nineteenth hole, right, It.
Speaker 3 (18:22):
Does not matter how well I do. Number eight.
Speaker 2 (18:30):
I have no idea what this place is. HubSpot? Anyone? Anyone?
Eueler? Jueler Spot? Yeah, you're just making stuff up sometimes. Look,
I wish I was, but no, HubSpot. It sounds
like a porn place. That's what I was gonna say. Yeah, yeah,
HubSpot it is. It powers your customer support, sales
(18:56):
and marketing with easy to use features.
Speaker 3 (18:58):
Like live chat. I don't know.
Speaker 5 (19:01):
These names are so generic now, they all sound like
Russian ops.
Speaker 2 (19:04):
It's a leading customer relationship management, CRM, marketing, sales and
customer service software company with a simple mission: to help
companies maximize sales and grow better.
Speaker 3 (19:14):
That would bore the bejesus out of me.
Speaker 5 (19:15):
You have something against maximizing sales and growing better?
Speaker 2 (19:19):
No, I'm just saying it just wouldn't be an exciting
place to work.
Speaker 3 (19:22):
That's just part of me.
Speaker 2 (19:24):
As far as my happiness, it has to do with a
degree of creativity and agency, autonomy, where I could do something
to stand out.
Speaker 5 (19:34):
I'm not money motivated by and large. I'm going to
see you in a HubSpot baseball cap sooner or later.
Speaker 2 (19:39):
Well, we'll see.
Speaker 4 (19:40):
Everyone isn't as exciting as you, Mo. Some people just
want a nine to five where they go in, they
clock in, they clock out. HubSpot, AI-powered customer service. Yeah,
it's like working at a bank. Yeah, I have.
Speaker 2 (19:56):
I've always gotten out of bed with the goal of
loving what I do, loving it and for many years.
And I know, Twala, you know this, and Mark, you
know this, and I assume, Stephan, you know this. That
means working jobs which don't usually pay a hell of
a lot. That's the trade off. Yeah, the trade off, yes,
because I'm quite sure I could have come out of
(20:16):
school, become an accountant, or gone into the finance world. I
had a Georgetown business degree. I could have walked into
a job because they were recruiting people out of Georgetown
left and right. I thought I was going to be
a corporate lawyer at one point, and that would have meant
going to law school. I'm quite sure I could have
done that, but I would have been bored to hell.
Speaker 5 (20:36):
You can have something stable and lucrative, or you can
have adventures. It's not very often you get them both.
Speaker 2 (20:42):
I've had plenty of adventures, and I've had
a lot of instability over the years as a plus-minus
trade off.
Speaker 3 (20:49):
I've gone through foreclosure.
Speaker 2 (20:51):
I've lived the life of working in the music industry
and spending money here and there and bottle service and
all the things that people see in music videos.
Speaker 3 (21:01):
I have lived that firsthand.
Speaker 2 (21:04):
The hot tubs, the stories of what goes on behind
the scenes. I haven't been to a Puffy, P Diddy party,
so get that out of your mind right now.
Speaker 5 (21:11):
I've been to the early parties, just not to the
after parties, not to the closed door parties.
Speaker 3 (21:15):
But I've been to a Luther Campbell 2 Live Crew party.
Speaker 4 (21:19):
Hey, we don't know what kind of Christmas parties HubSpot
is having. Taco Bell was having sex parties. We don't
know what HubSpot was. Say what?
Speaker 2 (21:28):
Yeah, they were, they were, they were having sex parties.
But I'm not going to work at Taco Bell just
for the sex party. Okay, I might. Burritos?
Speaker 3 (21:37):
Okay.
Speaker 2 (21:39):
Number seven, Esri. Does anyone know what Esri is?
E S R I, Esri. You have something on
Esri? Esri, tell us real quick, Twala.
Speaker 4 (21:59):
Esri, yes, yes, GIS mapping software. Oh, all right,
sounds like fun.
Speaker 2 (22:06):
Number six of the happiest places to work, well, the
place with the happiest employees: Proofpoint. What are these companies?
Proofpoint?
Speaker 5 (22:21):
These were created by a random name generator.
Speaker 4 (22:25):
Proofpoint protects data, which is really important.
Speaker 2 (22:29):
Yes, we know how important data is. Get your Proofpoint?
Speaker 3 (22:32):
All right?
Speaker 2 (22:33):
Number five. At least I've heard of this company, Paycom.
I've seen their television commercials. Yep, happiest place to work,
where they have the happiest employees. Coming in at number four,
and these are large companies with employees numbering more than
five hundred.
Speaker 3 (22:59):
Elsevier. E L S E.
Speaker 2 (23:02):
V I E R. I gave it a French
pronunciation, Elsevier. Oh.
Speaker 4 (23:09):
Elsevier, another global information analytics company. These big global
analytics, data protection, mapping, GIS, AI companies, they
probably are full of natural born freaks.
Speaker 3 (23:24):
So everything goes back to sex.
Speaker 2 (23:26):
Huh. If they're the happiest place, there has
to be a sex variable in here somewhere. Number three
of the companies with the happiest employees in the world,
number three is, I don't believe this, ADP? No. How?
(23:47):
What? For me, that's just got to be boring on
top of boring. It's like Paper Shredders R Us. ADP,
coming in at number two of the large companies with
the happiest employees in the world.
Speaker 5 (24:06):
Adobe. Okay, I know a guy who works there. Really?
And he's much more tranquil than I am.
Speaker 3 (24:13):
I believe it. Okay.
Speaker 2 (24:14):
Maybe they have this great sort of, I don't know,
investment package, 401k, vacation packages, it's got to be something. Yeah,
dividend sharing, I don't know, profit sharing.
Speaker 5 (24:25):
I don't know. They can give their friends deals on
the creative suite. It sounds like a good place.
Speaker 2 (24:31):
Yeah, all right. And number one, as far as large
companies, employees numbering five hundred or more, with the happiest
employees in the world, Stephan: Uber.
Speaker 3 (24:55):
Number one? Are you serious? I am serious. Wow.
Speaker 2 (25:00):
Now, first we have to get to the bottom of
whether they are actual employees. Are they including the independent
contractors who are drivers, or are we just talking about people
who work in the office at Uber?
Speaker 3 (25:13):
Yeah, that's that's a good point too.
Speaker 4 (25:16):
I think Uber paid to be on top of this list,
That's what I think.
Speaker 2 (25:20):
So we have two people who have been independent contractor
drivers for Uber.
Speaker 3 (25:25):
I don't know if they're counting you in this.
Speaker 2 (25:27):
Oh, they're definitely not counting me. Uber, the number one
company in the world when it comes to satisfaction of
its employees. Number one. Uber, number one. I'm not buying it.
Neither am I. Neither am I. Number one. Twala, we all
picked the wrong place to work. I wonder if Uber
(25:47):
has like a podcast that they're doing that they need
host for if you don't want to make it attacked.
Speaker 5 (25:53):
Yeah, if we're not careful, we just might wind up
working for Uber.
Speaker 2 (25:57):
Okay. KFI AM six forty, we're live everywhere on the
iHeartRadio app.
Speaker 1 (26:00):
You're listening to Later with Mo Kelly on demand from
KFI AM six forty.
Speaker 2 (26:06):
KFI, Mo Kelly. We're live everywhere on the iHeartRadio app.
And just in case you didn't know, it's Halloween time
at the Disneyland Resort, and KFI AM six forty wants
to give you a chance to experience the frightful fun.
The Happiest Halloween has brought fiendishly tasty treats, thrills for
one and all, and bootiful decor to both Disney California
(26:30):
Adventure Park and Disneyland Park now through Halloween, which of
course is October thirty-first. Just keep on listening to
KFI and Later with Mo Kelly for your chance to
win a four pack of one day, one park tickets
to the Disneyland Resort. And if you haven't been to
either of the parks recently, there is a case to
(26:55):
be made for either Disneyland or Disney California Adventure. There
is so much which has changed since the last time
I went to California Adventure. When I went most
recently a few weeks ago, it is a huge, built-out
park with Pixar.
Speaker 3 (27:15):
What is it they call it now?
Speaker 2 (27:17):
Is it Pixar Place? No, Pixar Pier. There is so
much there, and actually, I would think it's more
suited for the adults. There's just, you know,
you get to go to the Avengers Campus. There are
just more things to do for us grown folks over
at Disney California Adventure.
Speaker 4 (27:36):
And Pixar Pier is where the bars are. Yes, so
there's a lot more adult beverages, right, at California Adventure.
Speaker 2 (27:43):
Disneyland, they are changing that to keep up with
all the different movie iterations of all the things that
you may love. I went on Pirates of the Caribbean.
They finally finished all that updating of the ride, so
it is in alignment with Johnny Depp's version of the movie.
(28:04):
They've remade it. If you go through it, if you're
old enough to remember what it used to be fifteen
or twenty years ago, they've removed a lot of the
quote unquote objectionable content.
Speaker 3 (28:14):
It used to be.
Speaker 2 (28:15):
They would have the pirates chasing the damsels on the ride.
Now the damsels are chasing the pirates, which is weird.
Speaker 3 (28:23):
Well, not wenches, aren't they wenches?
Speaker 2 (28:26):
No longer. No longer they've been empowered. So now the
women are chasing the pirates. And I said, that's not
how it used to be fifteen twenty years ago. But
it's a different world now, you know. It's things have changed,
but it was a great time and we recommend it
for you. So just keep on listening to KFI and
(28:48):
Later with Mo Kelly for your chance to win a
four pack of one day, one park tickets to the
Disneyland resort. And I highly recommend this because let's be real.
You can go ahead and purchase them if you want,
but it's going to be expensive otherwise, so you will
want to zero in on this opportunity. Look, four people.
(29:10):
Twala's gone to Disneyland with like five people. He's
still broke because of it. Yeah, BYOW, bring your own wench.
Now we know we cannot use such terminology anymore. It
is twenty twenty four, okay, we do not refer to
women as wenches. We'll have to come up with something else,
(29:34):
then search your vast vocabulary, you know, dig deep into
that lexicon to come up with something less pejorative.
Speaker 5 (29:44):
It would behoove me to do so. I'll get right
on that. KFI AM six forty, go work for Uber. We're
live everywhere on the iHeartRadio app.
Speaker 1 (29:53):
There's a lot of misinformation out there, man, none of
it is allowed here.
Speaker 5 (30:00):
KFI and KOST HD2 Los Angeles, Orange County
Speaker 2 (30:04):
Live everywhere on the iHeartRadio app