Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Hey guys, it's Sammy J. And welcome to this week's
episode of Let's Be Real. This week's episode is with
journalist and tech reporter Lorie Siegel. You guys, I have always been so fascinated by technology and how social media is affecting all of us, and I think it's really affecting all of us this year, especially as we are quarantined.
(00:23):
Lorie Siegel has been an incredible journalist and she has
a really interesting perspective because she has been on the
ground of technology since Instagram, Facebook, and Twitter were just startups.
She has been interviewing these founders for years and she
has an inside perspective that no one else has. I
hope you enjoy the episode and I cannot wait for
your feedback. Hi, Lorie, I am so excited to have
(00:49):
you on my podcast. I am fascinated by what you
do because you have been on the ground since Instagram, Facebook, and Twitter were just startups. What's it like seeing the progression of these startups turning into billion dollar companies? I mean, by the way, when you say that, I'm like, I feel so old. It's so recent though, like this is so new.
(01:10):
I know, I think people forget, like, how recent it was.
I mean, it's funny. I'm writing a book right now
and I'm talking about my first interactions with the folks from Instagram. And by the way, they're my age. Like, first of all, I know you're eighteen, so you're young, but I'm thirty five years old. Like, I'm not that old, right? But you're not at all, right.
But like at the time when I started covering technology,
(01:30):
it was just like a bunch of us and we
were just all really young. And I remember interviewing, um, Kevin Systrom, who had this app called Instagram, and I remember us being out on the West Side Highway and it was just me and our camera guy. You know, I was still wearing the same black blazer that I wore on every single shoot because I wasn't sure exactly how to dress yet for television. And I
(01:53):
remember being, you know, um, super interested in this idea of an app, and being like, well, why would people want to use this photo sharing app? And you know, there were four people at the company at the time. It was before they sold to Facebook, um, and it was when I would still ask these founders, and I remember doing this, um, with
(02:15):
one of the founders of Twitter, why would people use Twitter?
I remember saying this to the founder of Uber, Travis Kalanick,
Why would people get into a stranger's car? Because you
forget all of these things. It was really interesting. It was, you know, interviewing these founders before they turned into the multimillionaires and the billionaires, and when they just felt, um,
(02:35):
very much like kind of your peers, and people who didn't really play by the rules and had these ideas of things that could change the world. And it was a really optimistic time, very much, in two thousand and eight, two thousand and nine, when I started covering technology, when I got into journalism as a young journalist. Like, I was just attracted to weirdos and it was just a bunch of weirdos, um, and so that's how I
(02:57):
got into it. When you talk to them now, do you still have that peer-like familiarity from back then, or do they act like CEOs and feel as though they are above you? Right,
that's a good question. Um, some change and some don't, right. It's like if you knew someone who was in a band that blew up, right. Some remain very humble, and then some change. I mean,
(03:19):
that's an honest answer, I think. Um, but I remember when the change started happening, which was super interesting to me. Like, I remember one day, I think it was Kevin from Instagram, when he came in and all of a sudden he had more PR handlers with him, and then there were just more layers and layers and layers. And it used to be you just would email one person and they would just respond back. I remember emailing, like, the
(03:42):
founder of Uber to just set up the interview, and then all of a sudden there were more PR handlers and more PR handlers, and then all of a sudden there was a multibillion dollar company. And so it was, I mean, it was super interesting. I interviewed Mark Zuckerberg during, um, Cambridge Analytica, which was such a, such an interesting moment in tech, and it was, I think, a time when we realized how much of our data
(04:05):
was not as private as we thought, which was really terrifying. And so the whole ethos of technology changed, and it changed our culture, and it changed everything we did, and everything got a lot more complicated. It really did. And I saw, I saw The Social Dilemma. Terrifying but so important to watch.
(04:27):
I've turned off all my notifications on all my apps except Gmail because I need it for school. But it really makes you look at how far we've come as a society with technology. When you were, you know, in two thousand and eight and two thousand and nine, did you ever think that technology would have gone to such a place where it would play such a role in
(04:48):
polarizing our country? Um, no, I didn't, but I did see it happening before it came, right. Um, you know, it's funny. I was looking through my notes, um, from a talk I gave to some students at NYU. And this was years ago. This must have been like two thousand and twelve. It was like a bunch of engineering students, and I was looking back.
(05:10):
I'd saved this piece of paper, and I had written things to look out for, um, and the number one thing I had written on this was empathy. Like, you've got to have it. We've got to start thinking about empathy really early on. I remember being like, I think it's really important for these engineers to start thinking about the human impact, because there was always this kind of ones and zeros, like, tech is
(05:32):
going to change the world, and this black and white view of it, um, and sometimes, just like that, the human thing got lost. And every time I had, like, a bad story about tech, I was kind of, like, you know, saying, okay, did you not see that this was going to happen? It was always kind of these human questions. And so I remember, um, you know, saying to this room full of engineers, like, you really need to think long and hard about the human impact of the
(05:53):
algorithms and what you're doing. And that was years before this happened. Um, and I remember beginning to see the cracks in the system, kind of before it happened, and, um, saying, you know, tech is not taking responsibility for the content on its platform, and the questions are becoming more and more complicated. And
I remember sitting across from Mark Zuckerberg and saying, how
(06:16):
does it feel to be editor in chief of, you know, of the Internet to some degree? And he didn't like that question. But I don't think I was ever mean in asking these questions, and that has never been my style as a journalist. You know, they're tough but fair questions. They're fair questions that we have a right to ask. Yeah. And I always thought it was important
to ask founders, especially because I grew up with these
(06:37):
founders and I believed in their products. And so, I think, um, I did see that polarization was going to happen, but to this degree, no. And could anyone have envisioned it? Um, probably not. But could tech have done a much better job? Yes. Could they still be doing a better job? Yes. Like, do I buy it when, um, you know,
(06:59):
when I hear the company line, we take user safety very seriously? I mean, like, no. I'm trying to find a balance of having a healthy relationship with my phone and with social media devices. But, you know, we are in a global pandemic, and now our only way of communicating is through technology. So how have you tried to find a balance between living a life through
(07:22):
a pandemic in a healthy way, and with technology? Not well? Like, not well. I mean, if I'm being honest, I could say I could be doing a much better job of it. I told you, as I'm writing this book, like, I had to delete Facebook and Instagram for three or four days because I am not good at the middle ground, you know, like, I'm just not. And
(07:43):
so I think that's going to be a problem for these companies, um, because I do think we are all in at this point, you know. And it's been programmed, um, as you probably saw in The Social Dilemma. You know, every product decision, whether it's like the color of the notifications, is programmed to make our brain go buzz, you know, to get
(08:04):
us to look at it. That part of The Social Dilemma was terrifying. So if they're programming things for our brain to respond to in a certain way, how do we take back control? I think there will be a new conversation in a couple of years. I think tech is going to be a part of us, like, we're not going to be able to just turn it off. Like, I do not believe... um, and maybe this is
(08:26):
a little controversial to say, but, like, we're in it. Like, we've opted in. We are all... like, technology is inherently a part of humanity now, and so, to some degree, in this next phase, we've got to learn how to interact better with it. And we need new entrepreneurs to build better products with humanity first, and so that's a lot of responsibility. I
(08:48):
interviewed a guy years ago who did, um, predictive data analytics to determine if something really bad was going to happen, um, like a suicide bombing or something awful. And I remember in the middle of the interview, we never published this part of the interview, um.
He was like a human algorithm. He didn't really have
(09:09):
much social ability. And in the middle of the interview, he was like, you know, I looked at all your data, um, everything you've posted on Facebook and Twitter and Instagram over the last eight years. And he's like, and I know a lot about you. And I'm like, well, what do you know? And, uh, he was like, well, you're unhappy in your relationship and you're growing unhappy at your job. And I was at CNN for ten
(09:31):
years. And I was like, and by the way, I'll be honest, both of those things were kind of true. Right where it hurts, yeah. And I remember being like, ah, I mean, it could be like a tarot card reading, right, where you're kind of like, okay, you are unhappy in this way and you just kind of go with it. But I was like, did you just do a tarot card reading? You
(09:51):
know what, that's what you're there to do. You're here to get this information in an interview. And he goes, totally.
But, you know, when I left the job and I left the boyfriend at the time, um, you know, I called him up and I said, like, how did you do that? And he said, every word you post, the time of day you post, the types of words, everything is an indicator. It creates this digital puzzle piece
(10:14):
of who we are and what we're not saying. And so I think there's actually something powerful there, right? Like, I don't think it's completely, totally freaky. Like, I'm sorry, I don't want to scare all your listeners. No, no, but this is important to know. But, like, advertisers are already doing this to us, right? So what if in the future, like, we were able, to some degree, um, to take advantage of that information for ourselves,
(10:37):
Like, what if down the line we had the capacity, um, to understand more about our own data, about our mental health and what it said? And so, I only say
this to say, as someone who has looked at the
future and the zeitgeist and what people will be talking
about down the line, I think there will be a
new conversation around technology and how do we take all
(10:57):
of these things and make them work for us as
opposed to us working for them. Yeah, take the power back. Yeah, like, you know, I'm going to bring back my old punk rocker self, but, like, yeah, let's take the power back in some capacity. That's so interesting, because it makes sense, you know, that it is programmed to benefit the companies, and the data... it's to make it
(11:19):
so we can have that power, you know. Um, I'm
actually, in my statistics class, we're doing this two month project where we choose four metrics about ourselves, um, and we record the results every day for two months and then analyze the data to know more about ourselves. And one of them for me is checking my screen time,
(11:39):
and it's been making me much more cautious. Two years ago, my sophomore year of high school, my anxiety got really bad and I shut off my phone for two weeks because I realized I was spending like thirteen hours a day on my phone. And we're at a point, I think, in technology where, when I was in that vulnerable moment, it can be really easy to blame social
(12:00):
media and technology for all of that. So when you say using our data so we can know more about our human behaviors, do you think that would benefit the people that are in vulnerable situations? I mean, look,
I think it could go both ways. And by the way, this is technology in a nutshell. Like, I think having a lot of this data could be super helpful, and
(12:22):
then I think it could also hurt us too, you know. And I think that is where ethics come in. And this is why, like, the sweet spot of my career has always been ethics. Like, I've been screaming about ethics for so long. It's very underrated. It is really underrated. It's, like, not that sexy, but now it's becoming more sexy because all these tech companies are under fire. Um, but you know, it's the theme that kind of
(12:42):
goes missing, um, and especially in something as powerful as technology. Like, I don't even think, you know, since the time I started covering tech... it's not a beat anymore. It's just humanity. Like, it's the way we love, it's politics, it's mental health, it's everything. I think it's a really good question, and I don't think it has an easy answer, and I think the best questions don't have easy answers. I'm still
(13:03):
figuring out what I want to do, but I really admire you as a journalist and as a reporter, because that's something that I'm becoming really interested in. Um, and, like I've said, you've been following this for so many years. When you interviewed Mark Zuckerberg about Cambridge Analytica, first of all, that interview was incredible, because that was like the first time I really saw him take accountability to
(13:27):
some degree. How do you go into that interview and
did you expect to get those answers? It's an interesting question. You know, I go into any high pressure interview hoping. So, first of all, I think in any interview it's always about the follow up, right. Like, I always know that they have the thing that
(13:49):
they want to say, that they need to say. So you just gotta let them say it, and then you gotta listen, and then you gotta follow up. Like, that's always my key. So, like, you know, and I also understand that, like, you've got to know them, um. I remember thinking, like, um, I had messaged Mark on Facebook. Um, the irony. People think that, I mean,
(14:11):
by the way, people think these things are so difficult, um, and by the way, it is hard to get an interview with Mark, sure. But people also, like, people forget what technology is. You know, a lot of people have, like, these booking departments that book for them, but it is his platform. I mean, doesn't it make sense, as someone who's covered technology for a really long time? So, you know,
(14:33):
and then I had followed up with his PR person,
but, um, I remember going out there, um. And I mean, by the way, it couldn't have been more dramatic because, like, a storm was coming in, um, into New York. Like, I think my life is constantly, it's like my life is a bit of a Seinfeld episode, always. I feel like if it were Anderson Cooper, it would have just been, like, you know, and maybe you can imagine what
(14:56):
went wrong. Um, but I do remember going in and thinking, um, this is such a big moment. And it was the moment for me that technology had become society, because everyone cared. It wasn't just my inside baseball tech friends that cared. Like, Facebook had managed to piss
(15:16):
off so many people with this, and it just was so many people's data and so many people, and, you know, their messaging had not been good, and there just wasn't an understanding of what went wrong. And I just remember going in and, um, thinking about how small the room felt, you know, and how
(15:36):
big the story was. Like, it was this small, cold room, because Mark likes it really cold in interviews, um, and so I was really freezing, and then we had the CNN countdown clock, and I was just, um... but I just remember I wanted him, and this is important to me, and I'm sure you understand this as someone who interviews people, I wanted him to feel like himself, because if he felt like himself, then he would
(15:58):
say the things he needed to say. Got to make them feel comfortable. Yeah, and, you know, but it's also always so balanced, because, you know, I wanted him to feel comfortable, but there was also a lot to speak to and a lot of accountability in that moment, um. And so it's interesting, because we did the interview, um,
(16:19):
and we kept going, because we went past our allotted time. Um, but he started, you know, we really started talking. Did his PR person cut you off and be like, it's time to go, or did they just let it happen? Yeah, I mean, I think, like, because we were only scheduled to chat like twenty minutes, but I think he really started, you know, and he was really kind of, I think, getting his rhythm too,
(16:39):
and so he went past what we were supposed to talk about. So we spoke probably for thirty minutes or even more, um. And, you know, when he answered, it was the first time he said that Facebook should be regulated, um, he said he would be willing to testify. And so we made a lot of news, and by the way, a lot of that came from follow ups, um, and really kind of pushing him on some of these things. But also
(17:01):
because we had the time too, and we took the time. And by the way, I'm thinking, well, I think I have to be live in like an hour and a half, how are we going to do this? But, you know, it's also kind of like everything else disappears, um, and it's just you and that person.
And it's also understanding the context of what's around you, um, and not walking in... And I've never been
(17:22):
kind of a gotcha person, and I don't believe in taking the cheap shot. But I also don't believe in just listening to people kind of, you know, saying their sound bites. And I also think people are more understood when they get beyond their sound bites. Like, I actually think they will be happier with the interview if they're able to actually say more too,
and you can push them and challenge them in the
(17:43):
right way. So I think we got into a good cadence, and I think it was a historical moment in tech. I think that, you know, it's like Almost Famous, like, the guy with his, like, eyes wide open, where you gotta just, like, you know... everything changes and you've got to ask the hard questions. And tech has become, for all the good it has created, a lot of bad, and there's a lot of accountability to be had and a lot of complicated questions,
(18:05):
and I want to be the one asking those questions.
Do you think, in your personal opinion, because you've been doing this for a long time, do you think it was meaningful? Or do you think it was, um, just another way to take accountability but not truly mean it? Because that's what happens a lot. I think it was meaningful. I think that Facebook was going through
(18:26):
a huge transition at the time. I think that Facebook had to grow up, truthfully, and I think they were growing up. I think that for a very long time they didn't understand, um, you know, that they needed to be more open with the media and with other people, that kind of thing. And I think that, um, that all changed after so much went wrong with
(18:48):
the election, and, you know, with Cambridge Analytica it was strike one, two, three, and people lost their patience, um, and so they had no choice and they had to start putting Mark out there. And Mark hadn't had to be out there before, you know. But it's interesting, because he hadn't done tons of press. So by the time he's starting to do press, like, um, I had this joke.
(19:08):
It's kind of like, puberty's painful, especially when it's public. It's like, you know, you're going out there as one of the most powerful people in the world now, but you haven't been out there much. And so it was such a fascinating, um, fascinating thing to be a part of, especially because I think I have a nuanced view of technology just because I kind of grew up in it. We have to take
a quick break, but when we come back, we're gonna
(19:30):
be talking more about how we know what's true on the Internet, how we can take back the power, and the incredible documentary The Social Dilemma. And we're back. What are your thoughts on when Mark Zuckerberg went before Congress and talked about his defense of not fact checking people's posts?
I mean, how do we make sure we get the
(19:50):
correct information out there when the founder of Facebook won't even ensure that? Well, I think it's a hard question, right, because, um, Mark has the famous line that he doesn't want to be the arbiter of truth, right. And, to a degree, like, do you trust Mark Zuckerberg to be fact checking things? Like, you know, and
(20:11):
for a very long time, I think, it's been a larger conversation of what role and responsibility do you want to put on technology? And I think it's an important societal question, and it's one that has, like, completely clashed in the last couple of years, because for so long when I interviewed tech founders, whether it was, you know, Jack Dorsey or Ev Williams from Twitter or
(20:31):
any of these guys, it was, we're just the pipes, we're not responsible, um, for the content on our platforms. Well, that's all changing now, you know. And now, because you have misinformation, you have so much stuff that has gone wrong, you have hate that's gone viral and literally turned into offline harm, and is, like, changing user behavior, changing elections. You know, it's time for a change. And so
(20:55):
I think there's a larger conversation now about what should those laws be, and should tech be regulated, which I think the answer is a hard yes. But what kind of regulation is, you know, the right type of regulation, and who should be doing it, and what is right and what is wrong, exactly. I think it's been interesting in this election to see, um, when Donald Trump tweets, you'll see Twitter will put a label on it. It makes
(21:19):
me laugh so hard. Yeah, you know, but that wasn't the case, like, four years ago. And even Facebook. You know, it's interesting to see how they both handle them differently. But I mean, there's so much more pressure on it now. But misinformation is, you know... I think in the time I've been covering tech, like, even this idea of truth has been so distorted. And what
(21:42):
is truth anymore? It's so scary to see that we are all living in these different realities, in this utopia of technology that was going to connect the world, and that these hippies that I knew, who were engineers, who were musicians with tattoos and thought they were going to change everything, have created some of the most profound problems that we will be facing, and that
(22:04):
will become the problems that we and our children will deal with, and that have fractured society in a way that is so fundamental. It is extraordinary, because they did not think of the human impact of the algorithms. Um, and so, do I want Mark Zuckerberg fact checking me? No. But do I think that they could have done a better job and a more consistent job and thought of
(22:26):
these questions earlier? A thousand percent. Yeah, and I think, like you're saying, it's really scary that, like, in The Social Dilemma they showed you could search climate change in, like, New York and then search it somewhere else, in, like, Alabama, and completely different results will show up. So it really does blur what the truth is. To speak to that, like,
(22:47):
how do we even find the truth now? How do we know what it is? Right. Well, it's funny, because it's like we have our version of the truth, right, like what we believe is true, and then there are the facts, and facts are facts, right, you know. And then you have, I mean, by the way, I did a lot of work on QAnon, looking at QAnon and conspiracy theories and whatnot,
(23:08):
which is this group that was amplified, by the way, by Facebook, um, of, you know, folks who just completely do not believe... they believe their own version of what they believe is true. Um, and it's just extraordinary. Yeah. I mean, I think you have distrust in mainstream media now in a way that you have
(23:29):
never ever had before. And that's for many reasons, I believe. I think mainstream media has changed quite a bit. And then you also have the president, who's attacked mainstream media for the last four years. And so all of these things have come together, and the Internet has, like, created this perfect storm, um, in this moment in time,
amplified by technology. It's so fascinating how much of a
(23:52):
role technology plays into all this, because someone might not think it at first glance. Something I find really interesting is that, like you said, mainstream media is very controversial now, but for example, like, when Fox News declared that Biden is president, are they fake news now? It's like that blur of what is personal reality, right? You know, I actually think,
(24:14):
um, down the line... um, I have a company called Dot Dot Dot Media, right. We're a niche kind of media company. But I actually think there's gonna be a lot of room, um, and this isn't just me plugging it, I genuinely think, and you see it popping up all over the place, like, I think we're going to see a lot of alternative channels, um,
(24:34):
of people just living in their own version, you know, um, for better or for worse. How do we... because technology can be very scary, and I think we need to do everything we can. So what can the people listening do that feel helpless? Because this is very scary and it could go down a very dark path. How can we reverse that?
(24:55):
How can we make it so technology can be used to impact society for the good, and make the negative impact much less? Well, I think, also, I mean, I think even being a little more cognizant of what you're reading and what you're tweeting and what you're sharing. I mean, I know that sounds so simple, but even Twitter, I think, recently made a product change where, instead of just retweeting, they ask you, you know, do
(25:18):
you want to, like, quote something first and then retweet it. Same with Instagram and stories. Yeah, you know, that product decision actually made a huge difference in people spreading misinformation. So, first of all, that can come from the top. Tech companies can do something, and they should be doing something. It shouldn't just be lip service, you know, um. And then I think people have a responsibility too, um. But we are in a moment of pain,
(25:43):
like, we are in a moment where people are so angry, where everyone is on their own side. Like, I even think, um, you know, with this election and with people out celebrating or being unhappy, or, you know... I am part time with a new initiative for 60 Minutes, and I spent time with militia groups and QAnon and the group
(26:06):
called the Boogaloo Boys, who are carrying around AR-15s and they're very anti-government. Wow. No, yeah, I think Wednesday, you know. But the thing is, I think, people, um... pain is real, and people have lost jobs, and we're in a tough spot,
(26:27):
and the internet makes a lot of that worse. And I think, uh, empathy. I still go back to that line that I wrote, that I was trying to talk to entrepreneurs about. I think, um, you know, if we want to get out of the polarization, we've got to talk to each other to some degree, right? Um, you know, I think that that's important, and not just staying in our own lane and staying on our own side.
(26:48):
And that can be not just on the Internet, but in real life, believe it or not. Imagine that. Um, you know, listening, and listening, you know, even if you don't agree with it. I think tech companies have a huge responsibility. I think they will be regulated, and I think they need diversity. And I think they need more people that aren't white men, truthfully, um. And yeah,
(27:10):
I mean, I think that's important too. And I think, um, there's a lot of work to do. I mean, I wish I could have one answer and wave my magic wand and fix it. And I think you need nuanced content. I've always believed in this. I was never a fan of just doing two minutes on TV of, like, okay, here it is. So, is that what you're trying to do with your media company,
(27:31):
Dot dot dot from me? UM, looking at technology through
the looking at humanity through that lens, looking at issues
like the unintended consequences attack, looking at issues like love
and politics and you name it. Mental health is a
big one, UM. Looking at these corner stories and the
people that people kind of turn away from UM and
(27:53):
being able to tell their stories in a more interesting, nuanced way that's not people shouting at each other, that allows us to hit it from all sides, um. That's always been something I feel pretty strongly about, and hearing those stories from the powerhouses in Silicon Valley, but also the people that are ignored. I got the name... I remember someone telling me I was the human equivalent of dot dot dot. Like, you know, when you're texting someone,
(28:14):
you're waiting, like, what are they gonna say? Um, I love that. Yeah, I'm like, I feel like this moment is so dot dot dot, right? Like, we don't know what's going to happen. Um, we're just kind of waiting for it to play out, and it's anxious. Like I say, there's so much to say, um, and
(28:35):
we just don't know what's going to be said. So I'm kind of okay with not tying the bow. I think that's the story of my life. Like, I'm totally okay with not tying the bow. That's hard for me, because I like to know things and I like to have things planned out. So it's actually your worst nightmare. It really is. That's why the time that we're living in is very scary for me, as someone who likes
(28:57):
to know what's happening. Yeah. Okay, we have to take
one more quick break, but when we come back, I
want to know who your best interview was, your worst interview,
and your toughest interview you've ever had. We'll be right back,
and we're back. You've interviewed so many people, from Mark Zuckerberg to Bill Gates. What has been your toughest interview,
(29:22):
your favorite interview? And, if you can answer, your worst interview? Um, my favorite interview, my worst interview... um, well, I guess I'm gonna talk about it probably in the book, so I guess I could give some hints at it. Um, my least favorite interview, um, maybe I won't say his name because that's going to come out in the book, but a founder of a big tech company that we
(29:45):
all know. I remember asking him about women's safety, because it was at a time when the company was dealing with many issues. This was like an online company, um. And I remember sitting in the newsroom with him, and I had put him on camera before a lot of people had. It was one of those big, big tech multimillionaire, about to be
(30:05):
billionaire founders, and I remember him taking off his mike and saying, Lorie. Like, Lorie. And I was like, what? You know, and he's like, I didn't know this was that kind of interview. And I was like, what do you mean, what kind of interview? Um, and, uh, he was like, that kind of interview. And I remember him turning over and talking to his PR
(30:27):
person and discussing if he was going to leave, um.
And I think for me that was the moment where, like, shit got real. That was the moment where it was like, wait a second, I'm not here to be your mouthpiece. Right. I've always been fair. I've always treated people with respect. I've always asked the hard questions, but I've always, you know, treated people with kindness and with respect. Um, I think for me that was a pretty extraordinary moment, because it showed, it was when the minnows turned to sharks, like when Silicon Valley, I think, had officially kind of crossed over for me. And not everyone in Silicon Valley was like that, right,
(31:08):
at all, by the way, at all. But the arrogance there, that this person had a company that was worth more than, like, Campbell Soup, right, you know, and that he didn't think that he needed to answer basic questions beyond, like, we
(31:29):
take women's safety very seriously, like, you know. And I think that was probably my worst. I mean, he stayed for the rest of the interview, but that got weird. Uh, let's see. Toughest? The tough interviews are when people don't want to say something and they need to say something, you know. The tough interviews
are a dance with, like, the other person, and you're just, like, you feel like you're doing, like, yeah,
(31:50):
you're doing, like, a ninja dance and they're just, like, foxtrotting. I love that comparison. You know, and it's like, there's one specifically like that, but I can't get into it right now. And then the good interviews, for me, I love interviews that enabled people, to some degree, to take their power back. Um. You know, I did a lot.
(32:13):
It wasn't just the tech founders that I interviewed, um, that were my favorite interviews. Like, my favorite interviews were, you know, women who have been... you know, I did a whole special back in the day when people didn't even know what the term revenge porn was, which is like a horrible type of harassment, mostly against women, where, like, men post images of naked women online. It's
(32:34):
like a form of, like, power and all sorts of stuff, and it wasn't even in the dictionary at the time, and women couldn't get their images off the internet, and it ruined lives. And I remember interviewing a woman named Nikki who was so scared to go on camera, and when I did put her on camera, she said, every time I look at someone, I wonder if they've seen me naked. And I couldn't even envision
(32:56):
what that would feel like, and the shame and the humiliation it would be for someone, um, you know, and what that would do to your psyche, right, to walk around and wonder if people around you had, you know, had seen you like that. And it was just such a... the type of harassment it was, this guy,
(33:16):
he had taped her without her consent, all that type of stuff. It was just such a form of power. And when we did this interview, it was almost like pointing the camera at her with her consent, right, like she was able to take that power back. And I really liked that, and
so, like, I've interviewed hackers and, um, victims
(33:37):
of, um, you know, of crimes and all sorts of stuff. So I think those are probably more my favorite interviews. And then some founders. Yeah, founders I think are also really interesting. Life is messy, and I like people who talk about it, you know. It's so interesting.
I've been into, um, San Quentin, the prison, and interviewed
(33:57):
a lot of prisoners about, um... they have a really interesting startup program where they teach them how to code there. But I've interviewed a lot of them about their crimes and what makes them, you know, what makes them a good entrepreneur also. So there's, like, a fine line between that thing that makes you a good entrepreneur and, also, those skills used in a bad way can be very, very bad. And so
(34:19):
I've always been a big believer in kind of walking into situations that might make you uncomfortable, and going in in, like, a very real, genuine, non judgmental way and trying to understand human nature. I think technology has always just been my way into talking about the human condition, if I do my job right.
You mentioned a book. Is that coming soon? Yes, I
(34:42):
have a book. It's coming, um, it's coming in August. So hopefully things will be better and I can go on book tour, and, you know, hopefully by then we'll, you know, be in a better position as a society. But it's going to be a lot of these stories. Yeah, I'll send it to you. It's gonna be called Dot Dot Dot.
(35:02):
So it's all about... yeah, it's very on brand. So it's all about, um, you know, tech and society and the second wave of tech and asking for what you want. And it's my story, but it's really kind of the story of, um, of this second wave of tech. So it should be, hopefully, exciting. Well, I am so excited. And like we've talked about, it has been a crazy year, um,
(35:24):
and I've been feeling very helpless, like many others. Yeah. And so for this season, I wanted to highlight a charity each episode so we can bring attention to some good in the world. So I was wondering if there's a charity that you're passionate about that we should highlight and talk about. You know, I've always supported charity: water, but I feel like, I mean, I also feel like there's room for other ones.
(35:46):
So if I come up with another one, I'm going to send you some other links, if that's okay too. Okay, we will include it and I will link it in my bio and we'll mention it. Okay, wonderful, wonderful, um, and thank you. I appreciate your time. It's been fun. Awesome. It was so great to meet you too. Thank you so much for taking the time. Yeah, I appreciate it. And I will be listening. I love your stuff, so I'll be listening to all your interviews. All right, guys, thank
(36:11):
you so much for listening to this week's episode of
Let's Be Real. As always, don't forget to subscribe to
the podcast if you haven't already, leave a comment because
I always love to hear your feedback. And don't forget
to follow me on Instagram at It's Sammy J. That's
I T S S A M M Y J A Y E. And also go follow Lorie Siegel on all
her social media. It's Lorie Siegel. And please check out
(36:33):
her multimedia company dot dot dot. It's really cool and
I think you'll like it. And if you're still here,
I have a little sneak peek for you because next
week's guest is singer, songwriter, and dancer Tate McRae, and it's
an awesome conversation and I am so ready for you
to listen to it next Tuesday. Stay tuned. Bye guys,