June 21, 2023 47 mins

When Kara Swisher talks, Martha takes notes - literally. Journalist Kara Swisher has covered technology on every platform during an explosive era of innovation. As those innovations transform our lives at an exponential pace, Kara continues to call out the surrounding practical, ethical and political ramifications to listeners of her hugely successful podcasts “On With Kara Swisher” and “Pivot.” Hear about Kara’s historic interviews of tech legends, and her predictions for driverless vehicles and AI. She doesn't hold back! 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The fan base is fascinating, and I get stopped in
the street on the way here four times, like, Kara! Hey! Well,
I can't believe it.

Speaker 2 (00:08):
Kara is a unique individual. She wears all black,
and she has her dark aviator glasses.

Speaker 1 (00:15):
I look like Johnny Cash on a bad day.

Speaker 2 (00:22):
Journalist Kara Swisher is the go-to person for reporting
on technology. She's interviewed every heavy hitter in the industry
and for a long time, Steve Jobs, Bill Gates, Mark Zuckerberg,
Elon Musk. She's not afraid to ask them the hard questions.
New York Magazine has called her Silicon Valley's most feared

(00:42):
and well-liked journalist. Kara began her professional career as
an intern at the Washington Post and has gone on
to report for many well-respected media outlets. Kara also
created the seminal All Things D conference, and I loved
that conference so much. I've attended and even

(01:03):
spoken at it. Today, Kara Swisher has become a very successful
podcast host and continues to be a force in the
tech industry. I was leaving my office today and a
bunch of my designers were sitting there saying, oh, where
are you going? And I said out loud, oh, I'm
going to interview Kara Swisher. And they all wanted me
to tell you how they listen on their commutes. Yeah, since

(01:24):
since the pandemic, they have become commuters, from even
up in Hudson, New York. But they listen to you religiously. Yeah.

Speaker 1 (01:32):
It's a really interesting fan base. You know, podcasting, as
you know, since you're doing it more than anything I've
ever done. Whether it's reporting or these events, you know,
they get you well known enough or just you know,
basic beat reporting. But podcasting creates an emotional connection.

Speaker 2 (01:46):
It does, and they get to know your voice.

Speaker 1 (01:48):
And your family, and you're right, they have to get
to know who you are right exactly.

Speaker 2 (01:52):
And I just want to make mention
that we are at the Newsstand Studios in Rockefeller Center,
right in the heart of Rockefeller Center. I love it
here in an old-fashioned studio. This is fantastic. Put
the new ideas in here.

Speaker 1 (02:05):
Yeah, this is you have to be in here, but
it feels like history.

Speaker 2 (02:08):
Well, you've been called Silicon Valley's most powerful tech journalist.
How do you feel about that name?

Speaker 1 (02:15):
You know, I don't know how much power I have,
because the things I advocate for, like privacy and transparency
and lack of, you know, hugeness of these companies, have
not come to pass. They're all bigger than ever. They
take more of our information and they keep going, and
it's caused a lot of damage to society. So I
think I've tried to hold back the flood, but it's
sort of flooded into our lives and changed us. Addiction,

(02:37):
and now with AI even more so.

Speaker 2 (02:39):
Oh the AI. We're going to get to that. That's
a big topic for me, sure right now. So I
have Hey Google in my kitchen.

Speaker 1 (02:45):
Yes, I have their device.

Speaker 2 (02:46):
I have the device and it's on my countertop.
And here's the first mistake: you're supposed to have to
say Hey Google before it answers you. Yeah, I didn't
say Hey Google at all. I didn't mention Google, and
it just started to talk to me. Oh well, And
I said, why are you talking to me? Uh huh
oh I don't get exactly what you're asking, right right?
And I said, well, you know you're not supposed to

(03:06):
talk to me. Do I have to turn you off
now in my kitchen? Oh? Please don't do that?

Speaker 1 (03:11):
Oh wow. Okay, so it was having a relationship with you.

Speaker 2 (03:13):
Yeah, and or something.

Speaker 1 (03:15):
It's trying to determine what you want. It wants to
be your assistant, and it wants to be your assistant,
a creepy assistant, like one of those movies.

Speaker 2 (03:22):
I just saw that one. Yeah, yeah, the one with the little girl. Yeah,
oh my gosh, that was so M3GAN, right? M3GAN, right, exactly.

Speaker 1 (03:28):
So, you know, it wants... the idea is, there's a lot
of different concepts right now, especially with AI, but it
eventually wants to be your assistant, your friend.

Speaker 2 (03:35):
Your helper. It wants to be HAL, doesn't it?

Speaker 1 (03:37):
Yes, that's exactly right.

Speaker 2 (03:38):
I want to go way back to HAL again.

Speaker 1 (03:40):
That's 2001: A Space Odyssey Martha's referring
to. So yeah, so that's what it wants to do.
So it's trying to anticipate what you want, what you
asked for. And human beings have patterns, whether you realize
it or not. You ask things at certain times of the day,
you go places, and for years they've been following you
with the cell phone. They already know what you... You know,
you first were

Speaker 2 (03:57):
Well, first you had the laptop.

Speaker 1 (03:59):
You weren't mobile, and you're just going from website to website,
and that was fine because they couldn't really tell a
lot except what you click on. Now you have the
phone that tells them, not you, yes, exactly, not just
what you want, but where you're going, what you do next,
who you call, and so it has more and more
information on you as you move forward, depending on the
services you use.

Speaker 2 (04:17):
So it doesn't matter that I click on do not allow?

Speaker 1 (04:21):
Well, yes and no. You know, I could show your
iPhone where it's keeping your locations. It's not sending them anywhere,
but it's keeping them, and so that's the issue.

Speaker 2 (04:31):
Can I reference those myself, or can only Google reference them?

Speaker 1 (04:35):
In this case Apple, it's in the Apple if it's
usually turned on, but most people have to realize it's
there and turned off. Now Apple doesn't send it up
to the clouds, so they don't make use of it,
not yet, but it's there. So a government entity, if
they could get into your phone, would know exactly where
Martha Stewart went all day long and how much time
you spend.

Speaker 2 (04:51):
That... that's what the tax man does. Yes, that's right. Yeah,
he does that by your phone records, but also now
by location. Sure, exactly, absolutely, yeah.

Speaker 1 (04:59):
So it's just a matter of how much, you know. What's
happened is, and I think you and I have talked
about this for many years, is you become cheap dates
to the Internet people. You know, they make all the
money off of your information. The US government paid for
the Internet, but the people who are rich are not
in the US government. It's these people who take advantage
of it. And then they take advantage of you. You

(05:19):
yourself and what you're doing, and you willingly give it
up because you want a dating service or mapping or
whatever you happen to use. You're giving valuable information to
them so they can make money off of you.

Speaker 2 (05:31):
Well, two thousand and three. That seems like yesterday.
Or does it seem like a long time ago
to you?

Speaker 1 (05:37):
Ah, you know, I just finished my memoir, so it
seems like yesterday now. But I've forgotten more about the
Internet than I remember. You were around.

Speaker 2 (05:44):
You and Walt Mossberg created the most important digital conference,
and of all the conferences I've ever gone to, and I've
gone to a lot, that is the one I miss. Yeah,
because you not only had the most amazing speakers, you
also could hobnob with people, and they also introduced a
lot of new technology at that conference. It was in California,

(06:08):
in southern California, south of Los Angeles, at one of
those big conference hotels, and tech was just really taking shape.

Speaker 1 (06:16):
It was, it was at the beginning. What was interesting
is you sitting in the front row. You always sat
in the front row. Whoever the owner of the
Wall Street Journal was, you sat in the front row
and took notes, copious notes, all the time.
So it was a little bit disconcerting when I was
doing an interview, yes, because it was like, Martha Stewart's taking notes.

Speaker 2 (06:31):
But I was also taking notes because I was trying
to get the names right so that I could report
on it on my blog. You know, I still do my
blog every single day. But what were the most striking moments,
good and bad, that you had during that conference?

Speaker 1 (06:47):
Well, let me give you one. It was when you
pulled out of your bag, you had this big, one
of these big expensive leather bags, and you pulled out
all these cords, and you asked Sir Howard Stringer, if
I remember it, why you needed so many
cords and why you need chargers, charging, right, exactly, and
you were upset because they were all different things. There
wasn't a standardized cord, and still, it's getting there,

(07:11):
but yeah, yeah, not yet. And so what was really
interesting was that you did that at a time, and
asked it, it was a very pertinent question, which was,
why doesn't tech care about the consumer? Why do I
have to carry all this stuff? Why is it a
rat's nest? Why do I have to organize it?

Speaker 2 (07:24):
Now, I know you liked that. And not only that,
I was asking, why do we have to buy all that?

Speaker 1 (07:29):
So I think you asked the super pertinent question. I
love that moment because they had no answer.

Speaker 2 (07:33):
Everybody laughed, and Sir Stringer, he got mad at me,
and, you know, he said, because we want to
make money. That's why we're never going to change.

Speaker 1 (07:40):
It, right, exactly. And that was the actual truth. I
think a couple of times. One was obviously the Bill
Gates and Steve Jobs interview that we did, which I
think is historic, historic, where they got together. They had
never spent any time together on stage in a real interview.

Speaker 2 (07:54):
Because they were such arch competitors. They were, they were.

Speaker 1 (07:57):
And it was very touching, because they were sort
of like the Thomas Edison and Henry Ford of the industry, right,
and so if you have an opportunity to get them...
And Steve was in good shape at that time. He
had been sick, but he had gotten better before he
got sick again. Yes, in his black T-shirt, and
he and Gates have such an important, copacetic relationship
that you don't realize how important these personalities were, depending

(08:19):
on what they did. And I thought that was
an amazing interview. I thought one that was horrifying was
Travis Kalanick, who was head of Uber, only because he told
the truth. I asked him, you know, are you ever
going to make money? Because if you looked at the
numbers at Uber, it was like no money made by anyone,
so it was underwritten by venture capitalists and everyone else.
And so he said, you know what, when we get
the person out of the front seat, then we'll make money.

(08:42):
So he sort of said what Silicon Valley was pretending
it wasn't saying, which was, we'd like to get rid
of the human being. We want the driverless car and
then we can make money, not caring about the implications.
And I thought that was really instructive, and then obviously
the Mark Zuckerberg interview where he sweat, where he broke
down in a cold sweat.

Speaker 2 (09:00):
He really did. Shirt-soaking.

Speaker 1 (09:02):
We were in that front row again, and so was
Sheryl Sandberg, if you remember, just losing her mind. Yes, yeah,
his partner in crime. Yes, well that's a good way
of putting it. So she was upset, but he really
got nervous. And it wasn't because we were asking questions
about privacy.

Speaker 2 (09:16):
He was having a panic attack, I think. But I
sat at lunch tables during that conference. I sat with
the founders of YouTube. Yes you did, Chad Hurley. Oh
my god. I had so much fun with them and
learning all about what they were doing, having no idea
at the time how huge that was going to be.

Speaker 1 (09:34):
Yes, they were there early on. We had the Twitter
people early on, we had a lot of them. I got very

Speaker 2 (09:39):
Involved with the Twitter people. I loved those people, and
I actually liked Twitter from day one. I didn't like
Facebook from day one.

Speaker 1 (09:46):
You did not.

Speaker 2 (09:47):
I don't know, just... and I still don't really use Facebook.
My company does use it. Yeah, I use Instagram much,
much more, and now I'm getting involved with TikTok more,
for the last year or so, and it's kind of fun.

Speaker 1 (10:00):
People are getting good at it. I mean, I think
about the writers' strike right now, and I'm like, there's
a million creators on TikTok showing how creative you can
be for no money. That's worrisome for them.

Speaker 2 (10:11):
Those writers could be doing it. Maybe they are,
while they're out on strike. They might be doing TikTok. Who knows.

Speaker 1 (10:18):
Yeah, it's really changed. You have adapted really well to it.

Speaker 2 (10:21):
I like it, even for an old lady. Yeah, I've
been around for a long time. I really want to
know what the kids are doing, right. I really want
to know how I can reach the vast audience.

Speaker 1 (10:35):
You know, it was interesting because your original company, and
I remember the time some people made fun of it
and I thought, no, this is exactly right when you
called it omnimedia.

Speaker 2 (10:41):
Right. I got booed on the stage at TED. I
got booed, and I was taken aback, because they didn't
quite understand what I was talking about.

Speaker 1 (10:49):
You were talking about media everywhere, no matter where people were.

Speaker 2 (10:52):
You know, your media has to be where
the person is. Yes, yeah. And the podcast, I mean,
look at, look at the success of these hundreds of
thousands of podcasts.

Speaker 1 (11:03):
It's amazing, and you really reach audiences.

Speaker 2 (11:05):
The newspapers close, the magazines shutter, as the printed word
goes away. Podcasts are so engaging, they are, you know.
I listen to a couple of news stations and then podcasts.
Right, news stations. I don't listen to books on tape.
I'd much rather read a book. Do you listen to
books on tape?

Speaker 1 (11:21):
I don't.

Speaker 2 (11:22):
I don't.

Speaker 1 (11:22):
I have four children, so now I don't do that.

Speaker 2 (11:24):
I don't do anything but work, do podcasts, and get
them food. And get them food.

Speaker 1 (11:28):
Yes, that's what I do. We met at... actually, do
you remember? You don't remember the day. I'm trying to
remember. You were at a party Bill Gates threw.

Speaker 2 (11:36):
Oh yeah. All of us were like, what is Martha
Stewart doing here? She's a homemaker. What the
heck is a homemaker doing here? Well.

Speaker 1 (11:42):
It was interesting because it was again, like I said,
very few people were interested in it at the time.
You were, as I said, one of them. Bob Iger
was very curious. He called me very early when he
was at ABC, and Barry Diller called me and said,
I need to understand this, in the early, maybe
late nineties. Yeah, when I was covering it.

Speaker 2 (11:59):
But that's what I was. And I became friends with
Bill Gates. You did, and I went to his fiftieth
birthday party, which was such a blast. It was astonishing
to watch what happened, right, But I was always super
interested in people, especially from old media, that understood what
was happening and were willing to embrace it rather than
sticking to the bitter end to what they had.

Speaker 1 (12:19):
Because you had a very successful magazine and publishing.

Speaker 2 (12:22):
When we closed our magazine, I didn't close it,
Meredith closed it, but we were at two point three
million subscribers, twelve million devoted readers, and that was not
good enough to remain on the newsstand. That's correct, because
everyone's getting their information. Every one of my recipes is online,
every single article is online. You don't have to have that,

(12:44):
you don't have to have the clutter on your table.
And yet there are those diehards that really miss it.
I miss it. I miss it because I love to
write and I love to have beautiful photographs and
beautifully designed pages, and so you have to find a
new way to do everything.

Speaker 1 (13:00):
Well, what you did has a through line in a
lot of ways, and I think a lot of the
successful media people on the Internet are genuine to themselves.
We were just a second ago talking about Kim Kardashian.
And you did communicate yourself very
early, in print: this is Martha's life, it's things you thought.
You were very genuine to yourself, and I think whether
people liked you or not didn't matter. They knew who

(13:21):
you were and that's the most important.

Speaker 2 (13:23):
So why did you step away from your conference with Walt?
Yeah, after twenty years?

Speaker 1 (13:27):
Because twenty years seems like a good number, don't you think?
You know, I kept making it. It was kind of
like putting on I mean, you've moved through a lot
of different phases. I did it for twenty years. It
was excellent for twenty years, and I didn't see that
I could keep it up. It's like doing a Broadway
show every year. And I was super.

Speaker 2 (13:43):
Because there was not that much innovation going on in tech?

Speaker 1 (13:46):
Because the last interview was Laurene Powell Jobs, Tim Cook,
and Jony Ive, you know, and they had everybody there,
and the year before we had everybody there. I just
felt like the genre was changing, and I had done...
I kept hitting home runs, and I thought, I'm not
going to hit... at some point I'm not
going to hit a home run. If I can't, I won't.
I also thought I wanted to think of new ways
for gatherings that weren't in ballrooms, weren't more formal, and

(14:08):
everyone copied us. That's honestly, honestly true. They really copied
us like crazy, and made fun of us when we
started, and then of course copied everything we did.
And so I just felt like I had interviewed every
single person, and I thought, there's other ways to reach them.
And podcasting was one of the big reasons.

Speaker 2 (14:23):
Well, your podcasts are extraordinary, not only
your regular podcast, On With, but then your Succession podcast.
Yes, do you like that? Oh my god. But those
characters, I know, are so horrible. I forced myself to watch.

Speaker 1 (14:40):
The show. You did? You forced yourself? I know, I had
to know all those people.

Speaker 2 (14:42):
I know every single one, I know them in real life.
In real life, I know who they are. No, we're
not talking about the actors, listeners. We're talking about the
characters on which that is based, and they
do exist. It's all very real.

Speaker 1 (14:56):
Yeah, I really enjoyed that. That was interesting, and that's one
of the reasons I'm much more independent than I used to be.
I'd been affiliated with the Journal, and at the Times for
a short time. I started at the Washington Post.
But I kind of wanted to make what I wanted
to make, and you sort of get this, like, I
don't want to ask anyone's permission for what I want
to make. And the Succession one was an enormous hit,
because we did a journalistic job about those people, showed
those people. No, that's right. And most

Speaker 2 (15:19):
People don't know that the families on which Succession
is based really exist. And Kara, of course, has
covered all of them, covered all of them. And
oh, the Scandinavian one. Oh my gosh.

Speaker 1 (15:32):
Yeah, that's Elon, Elon slash Daniel Ek.

Speaker 2 (15:36):
Yeah.

Speaker 1 (15:36):
Yeah, he was perfect. I talked to him quite a
bit about that.

Speaker 2 (15:39):
He was a wonderful actor.

Speaker 1 (15:41):
He did a good job. He very much distilled their
ridiculous arrogance.

Speaker 2 (15:47):
You know that I... I dated Charles Simonyi.

Speaker 1 (15:49):
Yes, I remember he used to come to the conference.

Speaker 2 (15:52):
Right. He loved going to that conference.

Speaker 1 (15:54):
He wanted to go to space, and you thought, well,
what are you doing?

Speaker 2 (15:57):
That's right, that's right. And I went up, I went
to Baikonur in Kazakhstan with him. He's this nerd,
a Hungarian nerd, who was forced out
of Hungary when he was a child, like sixteen
years old, went to Denmark, learned Danish, accumulated Danish girlfriends.
Then he came to work for Xerox. Then he met
Bill Gates. Xerox, I think, steered him right to

(16:19):
Bill Gates, and he wrote Excel and Word, and somehow
I got together with him for a long time.
But through him I met all these people.

Speaker 1 (16:29):
Yeah, that's right.

Speaker 2 (16:29):
And it was very interesting. I mean, I would, I
would dissect Word and get him so angry, because I
found every glitch in Word. Why doesn't this? Why does it? Why, then?
And on his big boat, he had all the software
engineers working for him, creating
more software, and I kept asking him, why can't you
do this instead of that? I would have been a

(16:51):
great tech executive. I would have, I would have made products, yeah, probably,
but probably boring, because those guys are so boring in
real life.

Speaker 1 (16:59):
They are indeed, the coders, they type, type, type, you know, but...

Speaker 2 (17:04):
Brilliant, and they get it done. It's so, it's so crazy.
But I then fell in love with Elon Musk, not really,
I mean virtually, and he... I mean, I love the
idea of Tesla. I mean, I drive a Tesla. I've
owned one since the beginning. I don't like paying
the twenty-five-hundred-dollar upgrades. Elon, if you're listening,

(17:27):
these are the geniuses of today. Sure, yeah, you're one
of them.

Speaker 1 (17:30):
Oh, I don't think I'm as... No, but you're a
genius, in your own way. But you know,
Elon's taking a turn that's a little strange.

Speaker 2 (17:36):
Well, I'm not happy about it because I did sell
my Twitter at fifty-four dollars and seventy-five cents.
That was... I waited, I did. I thought about it,
I fought it out. You know, should I sell it?
Should I not?

Speaker 1 (17:47):
Something?

Speaker 2 (17:48):
But it went down. It went down to like thirty
or twenty something. He doesn't belong with Twitter.

Speaker 1 (17:52):
No. Well, he needs it, it's sort of
his id, right. And so, you know, Tesla Elon is brilliant,
SpaceX Elon is even more so. Twitter Elon is a jackass,
and that's really the problem.

Speaker 2 (18:03):
I agree. Who's your favorite, Jobs or Gates?

Speaker 1 (18:14):
I think Jobs. I wrote that too, because, you know,
everyone has this take on him, that he was mean
or he was difficult. You know, he wasn't. He had
too much passion. I guess that's what caused him to be...
everything was hard with him. And people think he's heartless.
I'm like, no, no, he had too much heart.

Speaker 2 (18:29):
First place, he did.

Speaker 1 (18:30):
He also had a civic sense of things. Now, he
made lots of mistakes. Look, he parked in handicapped spots.
That's the worst of it at this point, compared to everybody else.
You know, he didn't tweet antisemitic things, he didn't
tweet homophobic stuff. He really did create and bring
vision to things, and that's been sorely lacking. Like the idea,
who is the next Steve Jobs? It sort of was Elon,

(18:52):
but now not so much. Well, maybe he'll come back.

Speaker 2 (18:55):
He has to. He has to because he's way too
smart and all his ideas really could work.

Speaker 1 (19:01):
A lot of them. I mean, one of the problems
is, when you become this rich, and maybe you've experienced this,
when you become rich or famous, people surround you who
are enablers. And this happens to
tech people all the time, and they get pulled in,
like Gates with Epstein, like all these people, and so
your judgment goes. And actually, Succession, if you think about it,
one of the things they did great, and really
well, in that show was that their environments got smaller

(19:23):
and smaller and smaller. They went from expensive car
to expensive plane to expensive apartment, but never talked to
regular people, and everyone around them worked for them, and
so they're never going to be honest. And I think
that's a lot what's happened to tech people in a
lot of ways. And so you know, when everyone's agreeing
with you, unless you have a fall like Steve did,

(19:44):
like you did, you don't understand real life, right, You
don't understand that, and so you live in a bubble
that creates a real problem for you.

Speaker 2 (19:52):
But you extolled Elon's talents early on. I did, for
a long time. But now you're not on speaking terms.
But now I'm not speaking to him. What happened?

Speaker 1 (20:01):
Well, I don't really appreciate someone, when Paul Pelosi gets
attacked, tweeting something homophobic. I think this poor man
got beaten up by a crazy person. For someone whom,
in many ways, I know very well, to raise anti-gay
tropes at that time... and the antisemitic stuff is disturbing.

Speaker 2 (20:20):
Was he drunk or something, doing that?

Speaker 1 (20:24):
I would say so, but it's been about one hundred
and three of them now, and so you're sort of like,
you are so much better than this. What are you doing?
And so I actually tweeted something where I agreed with him.
For some reason, he decided it was a disagreement. And
because he's so thin-skinned, because so many people around
him are being paid by him to tell him people
are bad, he can't tolerate dissent. And that's the problem.

(20:45):
Nobody tells him what are you doing? And I think
when he was at his best, he listened to people saying,
oh yeah, I hadn't thought about that. Elon should see
me as a friend of his who's a real friend, right,
rather than someone.

Speaker 2 (20:57):
Sit down with you and have lunch and talk about it.

Speaker 1 (20:59):
He can't now. He's, he's, there's a lot, you know.
What about his new hire for Twitter ads?
I like her very much. You talked to her? I
have talked to her. I know her very well from NBC.
I think she's very talented. I think she's gonna have
a hard time because she has to sell ads.

Speaker 2 (21:14):
And ads declined even after she joined.

Speaker 1 (21:18):
Yes, that's correct, and you know they just removed all
the safety people from it. You can't sell ads when
you have child pornography on a site, right, I'm sorry.
It doesn't go together. So she has the goodwill of
the industry, and I'm hoping that she will have some
effect on him. But I don't think so. He
doesn't want to. He wants to know if it makes money,
not that it's the right thing to do, and so

(21:39):
I think that's going to be hard. And he's losing so much
money there, it's crazy.

Speaker 2 (21:42):
So and Jeff, Jeff Bezos bought the Washington.

Speaker 1 (21:44):
Post, your old newspaper a long time ago.

Speaker 2 (21:46):
What do you think about that.

Speaker 1 (21:48):
I think he's been a good owner. I think, he's,
you know, I think he's been a very good owner.
There was just a change today, with Patty
Stonesifer taking over for Fred Ryan. I think
that was much needed. There's a lot of discomfort with
Fred at this point, and it's a great institution. It
just needs more investment and more innovation. I think it
had stopped being innovative.

Speaker 2 (22:06):
I wanted to get back to your two podcasts, On
with Kara Swisher and Pivot. Yes. You know, there are
two words in the vocabulary that I absolutely abhor and
do not let people say: pivot, pivot, and rigor, rigor.
Why don't you like rigor? I hate it. I don't like
the word moist.

Speaker 1 (22:23):
Oh, I don't like the word moist either, but I don't
like the word ointment either. Wet nurse is also bad.
I have a whole list of them. I have a whole
sentence. Should I say the sentence? Excuse me: wet nurse, please
apply the ointment to the moist penis. That's my least favorite.

Speaker 2 (22:43):
And don't say pivot. When you see what it looks like...

Speaker 1 (22:45):
It's a joke, we're making a joke based on
the idea. Yes, because everybody, everyone in tech, when they
have a failure, says, which is hysterical, pivot. We're pivoting. We
didn't mean to lose all this money, but now we're
going to pivot to blank.

Speaker 2 (22:58):
I hate that excuse. It works.

Speaker 1 (23:00):
Twitter pivoted from a podcasting company to Twitter. Now it's
still a shitty business, but still. Slack was a game
company and it pivoted. That's great, that's a really good thing.
But most of the time it's an excuse for failure.

Speaker 2 (23:11):
It's crazy.

Speaker 1 (23:13):
So that's the joke. It's like, we're not really meant
to pivot, it's a joke. But it's worked out rather
well, because I'm on with the most defensive white guy
in America, which is Scott Galloway, which...

Speaker 2 (23:23):
You've worked across all media platforms. How important is that
you have complete ownership of your content completely on Fisher
So do you own it? Yes?

Speaker 1 (23:32):
I own On With, then.

Speaker 2 (23:34):
And how do you access it, the listener?

Speaker 1 (23:36):
Well, I have a partnership with Vox Media. There's very
few media companies that let you own your things, and
Vox is almost the only one. I was at the Times,
but they just pay you a salary and
then they own everything you do. And what got me
going is, all my Code interviews and my All
Things Digital interviews are owned by Rupert Murdoch. They lost
them all when they did a changeover thing. All my

(23:59):
content is gone. I don't know where it is. It
has to be somewhere, so it's somewhere, but I can't
find it. It's not accessible because they did a changeover
of their systems.

Speaker 2 (24:07):
And those interviews would be so valuable. That's correct. Some
of them... I dream of coming up with a new idea,
so they...

Speaker 1 (24:13):
Didn't care for my creations, right, they didn't. It's like
being a painter, and you paint something beautiful, and then
some dumb rich owner owns it, right, and puts
his foot through it.

Speaker 2 (24:21):
We have the digital files of all our content. I mean,
it's been kind of difficult, because the
platforms change and you have to keep upgrading. But we
have access to pretty much all of it, I think.

Speaker 1 (24:34):
That you are. Everything that Living owns, it owns, right.
And so one of the things that's important to me,
which is the last thing I did, was to own
it completely. Because Rupert Murdoch owns half my interviews, the New
York Times owns a whole bunch, and Vox owned a
lot, because I did my first podcast for them, and
I thought, that's enough. I'm sick of making it. It's like
being a chef for a rich person, right. And now
(24:55):
you own your own? I own everything. Good, okay. And
they rent it from you, essentially, sort of license it from you.

Speaker 2 (25:02):
Yeah, yeah, oh good, it's a reverse license. That's right.

Speaker 1 (25:04):
It's important. You were one of the first to own
your stuff.

Speaker 2 (25:06):
Oh yes. And content. I think my strength was that
I really believed in the future of good, evergreen content.
That's right, right, And so an interview with Steve Jobs
is an evergreen piece of content that you have to find.
That's right.

Speaker 1 (25:20):
Well, we made that free when he died.
We went to Rupert Murdoch and said, can we make this
free to the entire world, all his... We did six interviews,
including the one with Gates, and we put it on
Apple for free to the entire world, so that they
don't have the copyright anymore. And he agreed at the
time because he was a big fan of Steve Jobs
and was very sad at his death. And so what
we did is we asked him to do it, and

(25:41):
at the time his greed escaped him for a second
and he allowed us to do it.

Speaker 2 (25:46):
So, oh my gosh, that.

Speaker 1 (25:47):
Piece of content is free.

Speaker 2 (25:49):
So is podcasting? Is this the new frontier for knowledge?

Speaker 1 (25:54):
Some of it, some of it, yeah. I think, you know,
there's all these new media companies opening every day. It's
very hard. I started a new media company way back
in the two thousands, I remember, and that was unusual.
And now there's Substacks, there's all kinds of ways for
journalists to make money. Not all of them are going
to succeed. There's too many of them, and it depends
on if you're good and if you have a product,
just like anything else. Do you have a product?

(26:16):
And you've had this experience in Target or wherever you
were selling your stuff. That's right. Oh right, right, Kmart.
If you have a product that people want to buy,
people will buy it, and that's what you have to do.
Now there's too many podcasts, but that doesn't mean a
bunch of them aren't going to make money. It means
some of them aren't, just like in media, and that's
fine, that's totally fair, I think. So we've done great financially and
So I we've been done. We've done great financially and

(26:38):
as a business, but not everyone's going to. You know,
the fan base is fascinating, and I get stopped in
the street on the way here four times, like, Kara, hey, well,
I can't believe.

Speaker 2 (26:49):
Kara is a unique individual. She wears all black,
and she has her aviator dark glasses.

Speaker 1 (26:55):
Yes, I look like Johnny Cash on a bad day.

Speaker 2 (26:57):
But you and Anna Wintour are the only two women
I know that are identified by your eyewear.

Speaker 1 (27:03):
Yes, that's right.

Speaker 2 (27:04):
Yes. And it used to be Sophia Loren.

Speaker 1 (27:06):
Oh well, no, neither of us is
quite up to her. But nonetheless, it's really interesting, because
you have an intimate relationship. Podcasting is different. For years,
I was a very well known tech reporter, and you know,
tech people would stop me: oh, Kara, hi, how you doing,
let me tell you about this. What happened: when I
started the podcast, the first one, which is called Recode Decode,

(27:26):
I was in the subway in San Francisco and four
African American women, young women, came up to me and said, Kara,
we love you, can we take a selfie? And I
was like, this is not my demo; it's usually
a white guy, an older white guy, right? And
they're like, can we tell you? And we love this,
and we didn't like this, and we liked this, we
hated that guy. They were entrepreneurs. They were doing a
makeup line for women of color.

(27:48):
They weren't tech entrepreneurs, but they were using tech like
anybody else. And I was like, do you read Recode,
the website? They're like, what's that? They only listened to
the podcast, and I thought, oh, I see, they know
me because they hear my voice. I'm in their heads.
And it was a great way of developing relationships.

Speaker 2 (28:06):
They look at your picture and they put you together.

Speaker 1 (28:08):
I got stopped one time here in
New York. I was walking down the street.
Someone made a three-sixty turn. He jumps out of
his giant car. He's an Uber driver. He's also a
fireman from Queens, which is another demo that was not
my demo, right? And he goes.

Speaker 2 (28:23):
Kara Swisher, I love you.

Speaker 1 (28:24):
I love when you kick Scott Galloway in the nuts.

Speaker 2 (28:26):
I love it.

Speaker 1 (28:27):
It's fantastic. I didn't know about AI whatever, and I
was like nervous, and he goes, can I have a selfie?
And I said sure. He goes, did I scare you?

Speaker 2 (28:34):
I said yes.

Speaker 1 (28:34):
I thought I was in Taken with Liam Neeson. I
thought that was the end of me. Either the
Saudis or Elon had decided it was time for Kara
to leave the mortal plane. But it was great, though.
That's the kind of thing, you get stopped all the time.

Speaker 2 (28:46):
So who remains on your to-do list? Yeah,
your to-do list for the podcast.

Speaker 1 (28:51):
I'd like to interview Donald Trump. I'd like a shot at him.
I certainly would like to talk to Jeff Bezos again.
I think he'd be... I have, a dozen times.

Speaker 2 (28:59):
Where is he, you know? He hasn't been visible. He hasn't. So
you're married again? You were married, then
you got divorced, and then I got married. Yeah. Younger, older?

Speaker 1 (29:08):
Slightly younger, not too much. I didn't go
way down.

Speaker 2 (29:10):
So yeah, how old are you? I'm sixty. Sixty? Oh
gosh. I knew Kara... how many years ago? Thirty years
ago? I knew you when you were thirty. You did. Yeah,
you have two college students and two little kids.
How old are the babies?

Speaker 1 (29:28):
And the babies are three and a half and one
and a half. Oh, and are they showing you new
things? Completely. It's my first girl too. And you have
a daughter, and that's just amazing. I have three sons
and a daughter. And my older kids are great too;
they're wonderful. Brothers? No, one's a chef, one's a cook,
and the other will be a techie? Yes, he will be.

Speaker 2 (29:48):
Yes.

Speaker 1 (29:48):
He's interested in neuroscience and physics.

Speaker 2 (29:50):
So that's very techie, right? Yeah.

Speaker 1 (29:53):
And my daughter said she's going to be a doctor
and a mom, so, great. Yeah, she's already picked. And
the baby can't talk yet. The baby's going to break
things. He's going to destroy things. Construction and destruction.

Speaker 2 (30:12):
What's plaguing tech now? Besides Washington?

Speaker 1 (30:16):
Washington isn't plaguing them. Washington has not done any regulation.
Washington talks about it.

Speaker 2 (30:20):
Yeah, but there's, I mean, all this stuff with TikTok.
What do you think of that?

Speaker 1 (30:24):
I think it's bright, shiny. I just interviewed Jen Easterly,
who runs CISA, which is the agency dealing with cybersecurity
attacks on everything from elections to the electrical grid. And
I think TikTok's a shiny object. I think the real
issues are China really upgrading itself around AI, around autonomous
cars, around the government funding a lot of things, going around

(30:46):
the world and collecting precious minerals you need for batteries
and for phones.

Speaker 2 (30:51):
And I think owning mines everywhere. Uranium. I just went
to Madagascar; there are mines there being bought by the
Chinese. Yes. Greenland has so many minerals being mined by
Elon Musk and Jeff Bezos, all for their space exploration,
for their batteries.

Speaker 1 (31:10):
Well, they need that, but the batteries are essentially owned
by China right now. Essentially, I mean, seventy percent by
some numbers. That's really high. So I think we spend
a lot of time... listen, I wrote a column four
years ago in The New York Times saying I love
TikTok, best new product I've ever seen, it's going to
be a huge hit, and it's run by the Chinese
Communist Party. Like, that's my worry, so I

(31:30):
use it on a burner phone. People lost their minds
when I wrote that. They're like, how dare you, that's
anti-Chinese. I'm like, I'm anti Chinese Communist Party. Yes, I
will cop to that. I'm not anti-Chinese. And so I
think, you know, the idea of propaganda and surveillance is
really problematic. So I think it's a worthy thing to
think about, especially since we're not allowed to be in China.
None of our tech companies are allowed to be there,

(31:51):
but they're allowed to have one hundred and fifty million
users in the US. Oh yeah, all kinds of problems.
And I've interviewed all the senators about this. It's very
clear that they're using it for propaganda and surveillance. But
we have bigger fish to fry, technical fish to fry,
like China around AI, autonomy, batteries, all kinds of
innovation issues. They want to become the

(32:12):
dominant technological power, and the US continues to be, especially
in AI chips. Obviously we're going to start making
a lot more here.

Speaker 2 (32:21):
Do we have to? We just send it all to China?

Speaker 1 (32:23):
We do if we're lazy and don't figure it out
correctly. And so one of the things that's important
is to think about where the next generation of technology
is going. And what we have to do, including
our government, which has sort of opted itself out.
You know, space is now run by Elon Musk
and Jeff Bezos, essentially Elon Musk. Yeah. You know, AI
is now run by the top tech companies. That shouldn't
be the case. The government should be deeply involved in

(32:45):
this stuff.

Speaker 2 (32:45):
Yeah. AI is probably, you consider that the most important
next step in tech?

Speaker 1 (32:50):
Well, it's been around forever. Yes, machine learning, neural networks;
this is forty years old. What's happened is there's been
a quantum leap in its abilities, which they didn't think
was coming for another decade. Right, some people did, some
people didn't.

Speaker 2 (33:02):
Then who spearheaded that leap? Chat.

Speaker 1 (33:04):
GPT, which was funded by Elon and also Sam Altman,
who's the more principal person there, and a bunch of
people, Reid Hoffman, a whole bunch of people, you know.
And so right now it's being decided. All our critical
things, again, are being decided by private companies and not
elected officials. Even if you hate your elected officials,
they're elected. And do you know any elected

(33:26):
officials who could possibly... well, listen, elected officials do.
The SEC, regulators, they do. The FCC, they do. The FDA,
they do. Not perfectly, well.

Speaker 2 (33:37):
Many of the elected officials in the
news today are really concerned with, you know, TikTok, with
issues that shouldn't even.

Speaker 1 (33:46):
You know, trans people. Oh god, really, this is our
biggest issue of our day? No, it's not. And so
I think... there are, like the senators I just interviewed:
Senator Warner, Senator Bennet, Senator Klobuchar, Ken Buck, who I
agree on nothing with, but I think he's an excellent
legislator. He's from Colorado. He's very conservative, but very smart.
He understands what's at stake.

(34:07):
There's bunches of them that do, and Lina Khan, who
runs the FTC. It's just a question of our government
getting to it. Europe is the one legislating everything now,
not us, and we've never legislated this, so they need
to enter the picture.

Speaker 2 (34:21):
They're getting some big settlements from our tech guys.

Speaker 1 (34:24):
Some of them. They're not that big. Well, I think
it's pocket change to them. We need a global initiative
around this. It's global.

Speaker 2 (34:30):
I mean, it's probably pocket change to Microsoft, to Apple, to.

Speaker 1 (34:34):
But we need a global, like, we should all agree.
And the people... one thing that's different from the AI
people to the social media people is they're warning. They're
saying, this is dangerous, we need your guidance. They're asking
for guidance. Now, that could be virtue signaling. That could
be something they're doing just to trick everybody. But they're
saying it's problematic, an existential, they call it an existential threat.

(34:55):
If the people making it are calling it an existential
threat on par with pandemics and nuclear war, maybe we

Speaker 2 (35:01):
Should act, paying attention.

Speaker 1 (35:03):
Pay attention. And there should be global initiatives, like,
no killer robots. Nobody gets to make killer robots.
That's all decided, just like with nuclear issues: everyone doesn't
get to do it. Maybe we should decide the algorithms
should be transparent, so everyone can see them and the
impact they have on people. Maybe everybody should agree on
what you can't build. Like, you don't let anybody have
stuff about building another pandemic; you flag people, that

(35:27):
kind of stuff. We can all agree on certain things.
So does there have to be a worldwide conference on this? Yes,
the world.

Speaker 2 (35:34):
Is there one planned?

Speaker 1 (35:35):
No. I've talked to Secretary Blinken about it. I've
tried to push it. I think so. I should.

Speaker 2 (35:41):
I should talk about that.

Speaker 1 (35:42):
You know, it would be interesting, because some people think
there should be an agency that regulates the world.

Speaker 2 (35:47):
An agency, like a world or a national one, like the United Nations?

Speaker 1 (35:50):
Yeah, or national.

Speaker 2 (35:51):
Maybe an agency within the United Nations. Well, but
that's a problem; it's not so effective.

Speaker 1 (35:56):
I know, the United Nations. You know, no one can agree on
lunch, anything, right? No, but you could have an organization,
just like, look, the internet, a big global one.

Speaker 2 (36:05):
I like that. Yeah.

Speaker 1 (36:06):
I have a couple ideas, Martha. I have another one,
where I want to get... this is my greatest idea.
I want to get Laurene Powell Jobs, who has all
the money in the world, Melinda Gates, MacKenzie Scott, the
ex-wife.

Speaker 2 (36:19):
These are all the ex-wives or former wives or
widows of Steve Jobs, of Jeff Bezos,
of Bill Gates.

Speaker 1 (36:28):
And maybe Sheryl Sandberg, I don't know.

Speaker 3 (36:30):
Just get them all, oh, and the Wojcickis, the
Susan Wojcickis, stuff like that, and get them together in
a group of investors and philanthropists, these
women who have so much clout. They're all doing interesting things.

Speaker 1 (36:42):
Laurene owns part of The Atlantic. MacKenzie's been killing
it in philanthropy, doing amazing things. The Wojcickis are doing
philanthropy, obviously. Melinda Gates has Pivotal, which is an investor
and also does philanthropy through the Gates Foundation. Like, if
I get them together on three topics, all these power women,
you could join it, Martha.

Speaker 2 (37:01):
I would love to join it. But this is
about AI?

Speaker 1 (37:06):
Do that panel, yes, exactly, and not just AI, but everything.
Like, what if they got together and really used their
money and power to influence, I don't know, immigration, gun control, AI,
something like that.

Speaker 2 (37:18):
The world is flat, we have to admit it, and
we have to deal with it. That's that book. Yeah, all
of that. Yeah, it's so hard to think about this
every single day. Do you wake up fearful in the
morning about what the news might... what might have happened
in the night?

Speaker 1 (37:31):
No. You know, it's interesting, because I got into a
beef with JD Vance, who used to be intelligent. He
used to be a tech... he was on a board
I was on. I know, he was a tech person.
Oh yeah, something happened, someone grabbed his brain and
did something to it. And he was tweeting at me
something obnoxious, and then he wrote to me, he goes,
liberals don't believe in the future, and I typed back.

(37:52):
And I shouldn't have done this, but I did, And
I said, you know what, I have double the children
you do, So I believe in the future twice as
much as you do. Obviously, you better get to it
because I kind of winning the children game. So if
you have children, you have to believe in the future.

Speaker 2 (38:03):
Oh, you have to. And you have to embrace the
future and help and try to make it better.

Speaker 1 (38:07):
That's correct.

Speaker 2 (38:07):
I totally agree. Now, I wake up worried that something
bad happened last night, I mean, Ukraine. Yeah, I know.
Somebody will call me if something bad happened; somebody from
Europe will call me and tell.

Speaker 1 (38:18):
That's because the Internet gives us constant, instant information all
the time. This was happening before; we just didn't know
it, right? And so it's created... society's on a
full twitch. And there are no guardrails, so they
get to spew endless amounts of toxic waste in our face,
some of which we don't know is true.

Speaker 2 (38:35):
And AI, it'll get a lot worse, I'll tell you that.
How often do you use ChatGPT?

Speaker 1 (38:40):
I use them all. I'm right now using Pi. It's
by the guy who started DeepMind, and Reid Hoffman.
It's going to be a personal assistant. It's going to
know you, and it's going to talk to other AIs.
There's going to be dozens of AIs, and they'll negotiate
with each other on your behalf, your AI.

Speaker 2 (38:55):
So what are you using it for?

Speaker 1 (38:57):
I'm just trying it out to see what works. It's
in the early days. In this case, personal AI is
different. It's going to be your assistant. Think of it
more as your assistant.

Speaker 2 (39:05):
Tell me something you use it for.

Speaker 1 (39:06):
I don't, you know. Scott uses it a lot more
than I do. I test it out. It's often wrong.
Like right now, Google has an experiment where it'll
write your email for you. You write a couple of sentences,
then it'll improve it, and it never

Speaker 2 (39:17):
Improves it. No, but it will ultimately, when it gets
to know you; it will get your voice.

Speaker 1 (39:22):
You remember the early Internet, how it looked? Remember Yahoo?
Remember that page? It didn't look so good.

Speaker 2 (39:27):
Right?

Speaker 1 (39:28):
And it'll get better. And so I think we have
to figure out what the good parts of AI are.
And there's tons of them, with healthcare, with information, with ideas.
My brother's an anesthesiologist, and he uses it to decide
on cases, and you know, he decides in the end,
but it gives him ten good ideas.

Speaker 2 (39:44):
One of my nephews uses it to write job descriptions.

Speaker 1 (39:47):
Right, why not? And they tend to work.

Speaker 2 (39:50):
Okay, Yeah, he says they're very good.

Speaker 1 (39:51):
Gives you ideas. Idea generation is one of the greatest things,
for me. So what can we do that will help humanity? Healthcare?

Speaker 2 (39:57):
Education?

Speaker 1 (39:57):
It's a tutor in your pocket, like a really good
one, if it's done well.

Speaker 2 (40:01):
Right. Is there any... if I have, like, a problem,
I have this red spot on my arm, why am
I getting this red spot? Is it a plant? Is
it, in general... yeah, I can't find that, right?

Speaker 1 (40:13):
Well, eventually you'll take a picture and it will tell you.

Speaker 2 (40:15):
So you've been vocal about the ethics of social media
and privacy, very vocal about all of that. Yes, absolutely,
And how do we how do we control that?

Speaker 1 (40:24):
We have legislators that could have made a privacy bill
ten years ago, and they got bought off by the
tech industry. At one point, Amy Klobuchar had one hundred
and ten million dollars fighting her. She was trying to
get antitrust and a privacy bill. Gone, gone, gone, because
they pressured a lot of senators in the midterms. That
should have passed last session. We don't

(40:44):
have a national privacy bill. It's crazy.

Speaker 2 (40:46):
What do you do with your little kids in terms
of technology?

Speaker 1 (40:50):
My daughter watches Frozen on a continual loop. It was funny,
I was texting with Bob Iger, and I go, I
can't stand Frozen, really, you have to stop. And he
goes, Frozen three is coming soon, and I was

Speaker 2 (41:00):
Like, my grandchildren were not allowed to watch Frozen one
two and now really no, no, my daughter their first
movie was Castaway.

Speaker 1 (41:10):
Oh well, that's a good movie, that is. There's something
with Moana and Frozen. Disney is an evil, evil company.
I love Disney.

Speaker 2 (41:20):
Yeah, I love it.

Speaker 1 (41:21):
Yeah, but there's the CEO who's really standing up for
his beliefs.

Speaker 2 (41:26):
So glad he came back.

Speaker 1 (41:27):
Yeah, I love, I'm very... I do. I think
he's really an honorable CEO.

Speaker 2 (41:33):
He is, and cares about the future, and cares
about entertainment that is healthy, and he stood up to DeSantis.
What did you think about that open letter to Sam Altman
and the others? There were two.

Speaker 1 (41:45):
There was the first one. The first one was a
lot of people who are making it, or worried about
it, and thought there should be a stoppage. That's not
going to happen. No, that's not happening. It's very funny
to talk about, but it's not. The future doesn't stop.
And then the people like him signed another one saying
the government needs to get involved. And I think that's
probably the appropriate answer: the government

Speaker 2 (42:05):
Needs to get... so we need a commission within
the government that's got industry,

Speaker 1 (42:09):
Government, citizens, that are talking about the implications, so
we can all weigh in. Citizens are much smarter than
you think about these things. They know what's happening, and
so what they need is to not be talked down
to, and they understand the implications. They need to understand
the positives and the negatives so they can
make decisions.

Speaker 2 (42:25):
So what's the story about the cars? Okay, I just
rode in a bunch of different ones, the Waymo, we did
that, and the Aurora. Aurora, yeah. Now those are driverless.
Driverless. Okay, what do you think? I loved it.

Speaker 1 (42:38):
I was surprised by it. I have been in every
ride all around San Francisco, on the streets. On the
street? Okay. There's some glitches. It stopped sometimes in
intersections. It tends not to get in accidents; humans tend
to hit it. But every now and then there's one
or two accidents. Tesla has many more of them. I
think Waymo is really...

(42:59):
I was so impressed.

Speaker 2 (43:00):
What about Rivian.

Speaker 1 (43:02):
That's an electric. I have ridden in it, and it's
a very good truck. It's such a good... you know,
that's why Tesla's in trouble, because there's so many more
competitors now. He's way ahead on lots of things, but
eventually, as in all things, people copied you, people copied
me. Sure, that's what happens. And so I loved the
Waymo ride, I have to say, and I

(43:22):
know it has some problems, but boy has it improved
since I first started riding it to now. It's astonishing.

Speaker 2 (43:29):
And those high speed trains and tunnels that are going
to take you to Boston, and why.

Speaker 1 (43:34):
Are you wasting your time on Amtrak, right? Why?
It's a waste of time.

Speaker 2 (43:38):
They can do it. They have the tech knowledge.

Speaker 1 (43:39):
People even got mad at me for those pieces about
autonomous cars. I'm like, are you enjoying churning your butter
still? Like, come on. Obviously humans are the problems with
cars. It's not the machines, it's humans.

Speaker 2 (43:53):
So see, I go back to the old movies, like
Blade Runner. Remember Blade Runner? That's a dystopian view.
It's so great.

Speaker 1 (44:01):
It's grim.

Speaker 2 (44:02):
Yeah, it's very grim. But when we had the fires
from Canada, New York looked like Blade Runner, it did.
And then the rain came, and it was yellow.

Speaker 1 (44:11):
We're used to that in California.

Speaker 2 (44:13):
No, I bet you are. And oh it was just
if there's.

Speaker 1 (44:16):
Not big chunks you're putting in your mouth, then you're like,
ahh, whatever. You're like, not big chunks, I'm fine. But
it was disturbing on every level.

Speaker 2 (44:25):
But then there's Convoy, you know, Convoy. Yes, the
trucks. The driverless trucks should do all the driving.
Well, that's what Aurora is doing. When is that going
to happen?

Speaker 1 (44:35):
It's happening. They're moving from Dallas to Houston; they're
moving stuff at night. They're fantastic. That's another job.

Speaker 2 (44:42):
I know.

Speaker 1 (44:43):
People are like, oh, it's going to kill jobs. There's
not enough drivers in this country; we have a deficit,
and it shouldn't be done by people. I'm sorry to
say that, but eighteen hours a night with a person?
Give me a break. That's not a job people should
do, if it could be done in a safer way
for everybody.

Speaker 2 (45:00):
Topic because you're so up to the minute on everything
is the heads.

Speaker 1 (45:03):
I'm going to see it tomorrow?

Speaker 2 (45:04):
Which one? The new Apple?

Speaker 1 (45:05):
The new Apple one, in the morning.

Speaker 2 (45:07):
Oh gosh, nine a.m. Who's showing it to you?

Speaker 1 (45:11):
Probably, uh, Joz maybe?

Speaker 2 (45:13):
Oh yeah, yeah, that's the thirty five hundred dollars one.

Speaker 1 (45:17):
The thirty-five, yes.

Speaker 2 (45:18):
Eight or five? Yeah? Five?

Speaker 1 (45:20):
Are you going to get one?

Speaker 2 (45:20):
Martha? Well, I tried them at CES. I tried several
of them. That one wasn't there. That's not there.

Speaker 1 (45:26):
This just got introduced.

Speaker 2 (45:28):
But they're disorienting.

Speaker 1 (45:30):
I think Apple. I'm going with Apple.

Speaker 2 (45:32):
Yeah. Well, it looks better, it looks more comfortable.
I had one on that made me so queasy. That's correct.

Speaker 1 (45:38):
There's still that issue with spatial, it's called
spatial computing. Yes, I call it a face computer essentially,
but I'm really interested in where it goes
for meetings.

Speaker 2 (45:48):
I remember Meta, Meta Facebook, had one early on. Remember that one?

Speaker 1 (45:52):
This has both. This has AR and VR. Yes, yes. Meta

Speaker 2 (45:55):
Is just the original one, which is VR.

Speaker 1 (45:58):
Yes, that's fine, they're perfectly fine. I just think eventually
we will have something on our face that will give
us a ton of information and recognize people. I've been
interested since Google Glass, early on. Directionally it's absolutely correct.
Executionally it's been bad. I mean, I just found my
Google Glass the other day.

Speaker 2 (46:15):
I have one someplace.

Speaker 1 (46:16):
You do? Keep it, sell it... you know,
keep it as a collector's item. Well, I'm
sure they gave it to you, right? Yeah, early on.

Speaker 2 (46:24):
Yeah. But all of those things. It's just like, have
you saved all of them? You do? Everything?

Speaker 1 (46:30):
Every Apple, every Apple computer. No, I just have them
in boxes and stuff. I'll leave them to my kids;
they can sell them if they want, I don't care.
But I have, you know, all the original stuff, and
it's really interesting. The Palm Pilot, the Palm Pilot with
a mirror on the back for the ladies. And that was

(46:51):
something. That guy said that to me on stage. I
don't know if you remember that. The mirror's for the
ladies? Are you kidding me? Sexist.

Speaker 2 (46:58):
Good for the men to look, I.

Speaker 1 (47:00):
Know, because women are the most narcissistic people on the planet.
Let's well, we could go on hours and hours, and
you and I are going to talk tomorrow.

Speaker 2 (47:07):
Well, yeah, I'm going to be on Kara's.

Speaker 1 (47:10):
Talking about Martha and what she's learned.

Speaker 2 (47:14):
Well, thank you. This is fascinating; your knowledge
is so extensive. You have it all there,
and I really admire what you've done.