Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:10):
Hi guys, and welcome back to another new episode of
You Need Therapy Podcast. My name is Kat and I
am the host, and you guys are in for a
treat today. I am so excited about this conversation. Before
I get into talking about our guests, I do want
to say really quick, remember that this podcast is not therapy.
(00:31):
It's just a therapist hosting a podcast, and you know what,
it might lead to therapy, but it's not. And now
that we have that out of the way, we can get
back to why I'm excited for you guys to be
able to finally hear this conversation. I've been itching to
get it out as soon as I had it and
my guest, who I'm so grateful to have had. I'm
so grateful to have been able to have the conversation, even
(00:52):
if it had been a private conversation and wasn't for the podcast.
This was one of those where, like, this challenged and
changed my life and I really just loved it. It
was a conversation with a freaking badass woman named Nabiha,
who is the president of the Markup, which is
a new investigative journalism startup. We talk all about that
in the episode, so I'm going to let the conversation
(01:14):
we have do the work there. But I want you
guys to know before you walk into this, how freaking
amazing this human is. One, you're going to get to
know this because she's captivating and she's easy to understand,
and she has a presence and an energy that makes
you just want to hang out with her. Side note,
like, I want to be her friend. I wish she
lived in Nashville. If you ever come to Nashville, please
(01:37):
call me. But she is such an intelligent, accomplished human
and also humble because you did not hear her talking
about how great she is in the episode. You just
heard her being a human. But she has done a lot.
She worked at BuzzFeed for a while. She was the
vice president and associate general counsel at BuzzFeed and was
(01:57):
the company's first newsroom lawyer. She has been described as
one of the best emerging free speech lawyers by Forbes.
She's worked at leading media law firms, and she has
worked on legal access issues at Guantanamo Bay. She's represented
asylum seekers in South Texas. She's counseled people on whether
to publish hacked materials, and spoke about misinformation at the inaugural
(02:19):
Obama Foundation Summit. She's incredible, and for all of
her work, she's been named a Forty Under Forty
Rising Star by the New York Law Journal. She was
a finalist for the Outstanding Young Lawyer of the Year
Award from the International Bar Association, and, I mean,
she has one award after award, and
was named one of Forbes' Thirty Under Thirty. She holds
(02:42):
a law degree from Yale Law School, and she's just incredible.
Like, I'm not even naming all the awards because she
has so many. And I just want you guys to
know that before we go into this conversation, because she
is somebody that I want you to trust.
I want you to walk into this being able to
really trust her. Now, I don't think you would even
need to know that, because her presence is a very
(03:04):
just, like, captivating one, like I said, and I immediately
trusted her. But she's just freaking cool. She is,
and she, like, has a family and is a mom
of a two year old, and she lives in New
York and she's just great. So I'm excited about this conversation.
We talk about social media, We talk about tech and
how it sometimes is a scary world and how to
(03:24):
gain power and agency back. It is an eye-opening
conversation that has changed the way that I am looking at
and approaching things in my life, and I just loved
having it. And on that I was having a conversation
with a friend today about Instagram and how I feel
like sometimes it feels as though everybody is privy to
(03:45):
all of our lives, like every part of our lives,
because now we have the ability to share all of
our life with everybody, and so I just feel like
this is a conversation that is so needed and
so special and so important, and it talks about how
we can gain power from that feeling back again. And
I'm excited to have this conversation so I can go
back to it and remind myself. So I bet you
(04:08):
guys can tell in the energy of my voice, like
I am so excited to share this with you. So
I'm gonna stop talking about it, and I'm going to
let you guys hear my amazing conversation with Nabiha. Thank
you for having this conversation with me. You are incredible. And
for everybody else please enjoy. I hope that you are
just as captivated and inspired as I am. So here
it is. Okay, well, welcome to the show. Thank you
(04:31):
for having me. I am such a hyper fan, so
I can't wait. Okay, Well, I love that because I'm
a hyper fan of you. Um, And this is gonna
be a conversation that I think is going to be
different than any one that I've had yet. And I'm
going to give you full disclosure. I understand about negative
amounts of anything that has to do with, like, the law
(04:55):
and a lawyer and that kind of stuff. And recently
I had to hire a lawyer for some stuff I
should have done a long time ago, and it just
is so confusing to me. But I feel like you're
the kind of person who can help it make sense.
I love breaking things down, and I love explaining things.
And honestly, the law is confusing, especially if it's
anything with free speech or tech or, like, privacy; it's
(05:18):
confusing even for the lawyers. So we're going to do
our best. Let's get into it. You've done a lot of
impressive things and exciting things that I want to hear about.
I want us to full circle come and talk about
the Markup. But I also know that you did a
lot of things before that, including kind of what you
were saying and working at BuzzFeed. I'm very interested in
your history, what you were doing at BuzzFeed and all that,
(05:42):
and then what brought you full circle. So can you
give us like a history? The in-a-nutshell version of my life?
First things first, I grew up on the Internet in
a way that I should not have. My parents got
these AOL free thirty-day CDs.
I don't know if anyone else remembers those. You could
get the little CDs mailed to you, so you
could have dial-up modem internet through AOL,
(06:05):
and I got that in second grade. And so I'm
just, like, totally unsupervised, because my parents were like, Internet, computer,
this is educational. So I grew up loving the Internet.
Like, I was what the kids would now call thick,
but that was not the word we used in the nineties,
and I had never seen, like, bodies like mine except
for on the Internet. I'd be like, oh, like, here's
(06:26):
all kinds of people, all kinds of shapes that I
don't see on Self magazine or Vogue or certainly TV.
Or like anything in Orange County, California, where I grew up.
So I always loved the Internet. That's the journey I
went on. So when, you know, I went
to law school, I was so interested in how the
Internet was changing these expectations of what the law prescribed
(06:50):
for, like, newspapers and TV. There was a whole new medium
that needed whole new rules. So I went to
a law firm and The New York Times, and when the
opportunity came to go to BuzzFeed, it felt like someone
was saying, welcome, would you like to come to the Internet?
Do you want to be a lawyer for the Internet?
And I was like, yes, I do, I've been waiting
for this. So thank you. Oh my god. But
(07:13):
What I also just heard you say is
when you were introduced to the Internet, it was like
a really safe, helpful tool. Yeah, I mean, it was
definitely a wild, wild West, right,
like early nineties Internet through AOL had a
whole lot of stuff that I probably shouldn't have seen either.
But there was also, it was like message boards. So
I used to go to this place called ChickClick,
which does not sound as nefarious as the phrase sounds now.
(07:37):
It was a feminist message board that was just like
all kinds of wonderful women talking about all kinds of
things in their life, like hey, my breasts are uneven.
One's bigger than the other. Which is, like, all kinds
of things that, like, I was the eldest girl in
an immigrant household, like I'd never heard this kind of stuff,
and so I read this kind of stuff, right, because everyone
(08:00):
was reading at that stage of the Internet, you weren't watching,
and I think that's an important description. You'd see, like, pictures,
people posting pictures of themselves, and I'm like, okay, alright, alright,
it's normal to look that way. It's normal to be
like this, it's normal to hear about feminism. And that
part was amazing. I guess it was new too, So
there was all this like craziness out there. There was
a ton of craziness, right, So and be reading something
(08:20):
and all of a sudden be like, oh what did
I just be? Oh no, well, I guess I remember
when I was younger, all the pop ups that would
they were like I was like, wait, I didn't want that. Yes,
You're like, oh do not want that? Click x out
x out I do not want this, but it somehow
felt felt more organic and kind of unintentional when that
(08:40):
pop up would come up. And now you know, these
platforms feel very curated, and then it just kind of
felt like turned a corner, saw something weird, turned that
corner again, like, keep going. Yeah, okay. So then
you got to BuzzFeed and you're like, I'm going to
be a lawyer for the Internet. Exactly. What does that
even mean? It meant that I was around to answer
any kind of questions that people had. So,
(09:02):
people are like, hey, can I use this music in
the background of a video? And I'm like, well, let's
think about what Madam Queen Beyoncé might have to say about that.
The answer is no, she does not want you to
use her songs in your video. But also really interesting
things like when we had reporters who wanted to dig
into allegations of sexual harassment happening at schools, right and
(09:23):
they're like, I want to, and this is before Me Too.
So you have these super talented reporters, like Jessica Testa,
Katie Baker, who, before the sort of Me Too movement
in the press, were like, how do we report on
what girls in schools are saying about their teachers and
their principals, knowing that if we get the story wrong, the
(09:44):
school is going to sue us. How do we appropriately
reach out to these students and like, you know, give
them a microphone for their truth while also understanding the
complications here. And so I got to advise on those
kinds of stories too, which is really meaningful for me,
and you know, hard national security stories like publishing all
kinds of things about politics. But my
(10:06):
job was to be the one that made sure people
didn't get in trouble for speaking truth in the way
they wanted to. Which is really cool, and also, like,
that's kind of scary too. Yeah, I thought it was
a lot of fun. I mean, it's really fun when
you work with reporters who you trust and you know
where their heart is and you know where their head is, right,
So if they want to make sure that the right
people's stories are getting to the Internet, it's an honor to
(10:28):
be able to help. Yeah. So there's this one part
of the Internet and all of the things that
have been given to us and offered and been able
to be created for the good. Like, it's an easier
way to access information. I talked about this all the time,
even with like TikTok and like Instagram, how that's changed.
I'm so grateful and like podcasts like, I'm so grateful
that there is a way for people to get information
(10:51):
that they otherwise would never have, that we can be more
educated and we can learn more and it can connect us.
But then there's also this space where it's like, wait
a second, is this a conspiracy theory? Who's brainwashing us?
Is this information true? Why should I believe this person?
Anybody can say anything? Why am I getting this ad? Okay,
now I bought this thing that I got this ad
seventeen times? Like did I even want that? What do
(11:13):
I like? What should bodies look like? It's so, so
much, and so I don't even know the right question
to ask. But how are you sitting in the Internet
is good, the Internet is scary? It makes a ton
of sense, and it's the trickiest, because here's how I
think of the Internet, especially these days. You know, like
every science fiction series, um, there's always that episode where
someone can start listening to everyone else's internal monologue and
(11:36):
they start hearing all these things in their minds and
they're like, oh, no, I'm hearing everyone else's thoughts. That
is literally what the Internet is. You're hearing everyone's thoughts.
And the only way to manage that is to build
defenses and structure around it, right to say, okay, so
let me tell you my TikTok rule. My TikTok rule
is that I can only watch TikTok at the end
(11:57):
of any hour for leftover time until the top of
the hour. So if we were to finish at eleven
fifty seven, I have three minutes and I can just
be on TikTok. Because you've got to control your time; otherwise,
these platforms are designed for you to give them all
of your attention and all of your time, right? So
you need time structure. I also think you have to
remind yourself often that a lot of what you see
(12:20):
isn't real, and you have to remind yourself that constantly.
And that's really tricky because you want to relate and
connect to people. That's what's beautiful about it. But it's
kind of like your friends, right. You have people you
run into, you have your acquaintances, you have your friends,
and you have your besties. That's a structure that we
have in our real life. You've got to bring that
(12:41):
to social. Not everybody is your bestie, no matter how
much they say they might be. You've got to understand
who's like, you're the D team. Like, I'm not
going to trust everything you say. You're around sometimes, you're funny.
I'm not gonna, like, hang my life on what you
have to say. And just like you do that in
real life, you have to do that here too. What
do you think? Because I think that's hard for
people to actually put into action. It's like,
(13:02):
I know that not everything I see on Instagram is real.
I know that there are filters. I know that, yet
I still get tricked. This is not even a professional question,
but in your own just personal opinion, why do you
think it's hard for people to remember that. I think
it all boils down to proximity and relatability. And
here's what I mean by that. Magazines existed for decades
(13:22):
and decades. We all saw images of people who were
something to attain, right? I would look at Tyra Banks
and be like, I don't look like Tyra Banks. Fun fact,
no one looks like Tyra Banks. So it was
very easy to be like, oh, you're over there.
But these are devices we hold in our hand all
the time. It's not up on a billboard or on
a TV. It's the same device you used to text
(13:44):
your mom or your best friend. It's the same device
you used to send messages to people you care about.
It's so intimate. The proximity makes it really get into
your mind. And the other thing is the relatability, right?
When I didn't look like Tyra Banks, it was okay.
I was like, I don't go to school with Tyra Banks,
I don't really see her at the grocery store. But
what you see for so many influencers on Instagram and
(14:05):
TikTok is the entire brand is based off relatability, right,
Like here I am looking beautiful at the grocery store,
and that gets into your brain right where you're like,
am I supposed to look great at the grocery store?
Like I found a band aid in my own hair
when I was at the grocery store yesterday, so like
that's where I'm coming from. And so it's like
that sort of relatability of seeing this kind of appearance
(14:30):
situated in otherwise where-you-want-to-be-looking things, and
it will just get in you. And
I think, I'm really ruthless about unfollowing.
If I'm like, oh, I had a bad feeling when
I looked at this, even if it's a little feeling
of envy, a little flicker of jealousy, it's just like, unfollow,
get out of this. There's also a Saudi fashion
(14:50):
blogger that I follow who is amazing. Her name is
Ala Baffi. She's like at Allah. She had this really
interesting experiment over the summer where she unfollowed everybody on Instagram.
She has hundreds of thousands of followers. She just went
down to zero because she wanted to see what it
was like to create in a space where she didn't
feel like she was getting inputs from other people. And
(15:10):
she did it for a month, but she said it
was like a palate cleanser. And so I think you've
got to be super in tune with where you are
and then know that you might need those retreat moments,
whether it's quitting the app, I find that hard, or
that kind of, like, just palate cleanser. Like, you gotta
do what you need. Well, I find it hard
to quit because that brings in this, like, black-and-whiteness,
(15:30):
and for a lot of people, like there's so much
good that can come from this and so much education
and so much connection and so and creativity and fun
that when we are like, I'll just quit
for six months, okay, but why? Because then when you
add it back again, what's going to be different? Exactly. And
that's why, so, I can't quit.
It's just that's not how I work. So I'm all
(15:50):
about putting in structure around like what are the times
of day where I know, like this is like fun
for me? Right? Who are the people that I find
really generative and enriching, of which there are a lot,
And who are the people that sometimes I see this,
I'm like, I just feel bad, Like I just I
just feel bad when I see this, right? And
(16:12):
for me, it was, and this is, you know,
just personal, like, people who are really into documenting
their weight loss journey all the time, and, like, what
they're eating and stuff. I'll just be like, you know what,
good for you. I wish you well for what you
want to do. I am not going to be on
this, like, National Geographic voyeuristic endeavor with you. I'm out,
like, I'm out. I'll follow you back later, but not
(16:33):
right now. I can't. Yeah, and that doesn't feel good
for me to watch that. No, it doesn't. And again
I don't want to police or judge what anyone wants
to do with their body. But I don't need to
have a front row seat to this. I'm glad you
want to share. I don't need to be here for it.
Thank you for saying that, because I think a lot
of times when you unfollow somebody, and I feel like we're
getting off topic, but I don't even care, when you
(16:54):
unfollow somebody, it's like, you know, people have, like, apps
that they can see, like, who unfollowed them, which I
think is so unhealthy. But again, do what you
need to do. It's like that is like an attack
on me, like they hate me or they don't like me,
or how could they unfollow me? But that doesn't mean
that what you just said is, I'm not saying she's
doing anything wrong or bad. Do what you need to
do to be the person you want to be. I
(17:14):
don't want to watch it because it doesn't feel good
to me. It's boundaries, period, exactly, period, And that's just boundaries.
And I feel like what I am really intrigued by
is, as we come to these realizations that tech,
like, can be good, can also be really bad for
people's self esteem. I am seeing from a lot of
entrepreneurs and innovators I know, attempts at solving this through
(17:39):
providing more opportunities for structure. So what does that mean?
I was just talking to this person who was like,
wouldn't it be cool if I could create a plug
in for Instagram, and you could just decide that I
don't want to see anything that's, like, hashtag weight loss journey,
hashtag shredded for summer, hashtag whatever, and for some people I follow,
it would just block those out for me, right?
(18:00):
I'm not there. And that way it sort of speaks
to what you're saying. It's like, I'm not unfollowing you.
I'm just saying I'm not here for that hashtag right now,
like I can't be Yeah, And like I have a
friend who unfortunately like went through a miscarriage and she
just was like, I can't see everyone's baby announcements right now,
Like I just I'm not there. I wish there was
(18:22):
something that she could just be like, yeah, that's not
right now, just not right now for me. I want
to be on the platform, but just not that. But
I think there are people who are dabbling
in those kinds of solutions, and I'm excited for that. Yes,
I want that. I want that right because it does
feel like how do I set up my boundaries? I
can set up my boundaries of who I follow, who
I mute, how much I'm on it, But sometimes it's
(18:43):
just, like, the pop-up. It's like, I didn't want
to see that, like, I'm not here for this. I
didn't want to see that. But this
is what I think is so good about surfacing these harms, right,
about having these news articles, having people talk about
this openly, because once we identify what the harms are
and how people are honestly feeling, we can design around that.
(19:05):
And for some people, look, depending on the platform, it
might be yeah, maybe that shouldn't exist, but I think
there might be a layer below that, right, that's like,
how do we make these platforms responsive and designed for
people so they can set their boundaries in a healthy way.
(19:26):
So I didn't say this when we started, but my
best friend connected us, Tory Pool, and she's been on
the show before. She's freaking amazing. She's awesome. She's like
you, she does seven hundred things, and I'm like, how
are you doing that? I texted her the other day,
I'm like, I can't even make my bed, and you're
planning a birthday party, a christening, you work a full
time job, you teach SoulCycle. You're freaking smart, people.
(19:48):
How do you do all this? She is amazing though,
she I mean, she's like the next level amazing human.
She really is. So anyways, when she was talking to me,
I was like, oh, this makes me think about that
documentary that came out, I guess it was last
year, The Social Dilemma. I didn't watch the whole thing,
full disclosure, because I was like, this is information overload
and now I'm scared. But what I remember from the
(20:08):
beginning of it is it was a bunch of people
who helped with the start of these big platforms, I think
it was, like, Twitter and Google and stuff like
that, that were then coming and being like, oh, we
didn't mean to create this. This is not what we
meant to create. And so I want to kind of
talk about that a little bit, and then I want
(20:29):
you to tell us about what the Markup is and
what that is doing. Because what I saw with
me and my own clients, who I was working with
too, when that came out, everybody was like, I can't
be on the internet anymore, and I was like, well,
we have to. So what do we do and how
can we help ourselves? The boundaries and all that, over
there. But kind of what I was talking about
in the beginning too, it's like, I find myself buying
(20:51):
things that I'm like, I didn't want this. Why is
it in my home now? So I don't know what
the question is, but can we talk
about that? Well, first, I think we're in
a really exciting and interesting moment right because these conversations
are happening, And what these conversations mean is Okay, does
it have to be this way? Right? Does it have
(21:12):
to be this way? Is this inevitable? Is this the
future we're in? And I think enough of us
are like, no, I don't want this. I don't want
to be persuaded into mindlessly buying things, or feeling bad
about myself because of something I saw, or questioning whether
anything is real because all of the information is out
there. If we don't want it to be this way,
(21:32):
first things first, we have to understand how we got here,
and that goes back to the first part of your question, right?
Like all of these people built these systems. They built Google,
they built Twitter, they built Facebook. I don't think this
is where they wanted us to be. They made a
ton of money getting here, but I don't think that
was the dream at the outset. So we have to understand,
(21:53):
like what happened? Right, How did we get from this
sort of like fun wild internet to this place that
feels like a digital prison for our minds and our behavior.
Like, what happened? And so there's a lot of interesting
things, I think, to say in this space. One artist
that I love. I didn't think a lawyer would talk
about art, but I do love art, and I think
(22:14):
it's important in this time. There's this artist, his name
is Ben Grosser, and he has this exhibit up called
Software for Less, and in it he redesigns all of
these platforms to be optimized for less, not more. So
imagine these platforms like Instagram, they want you to post
all the time, right, The algorithm is hungry. You've got
to feed it constantly. What would it look like if
(22:36):
actually you signed up for Instagram and it just gave
you a hundred posts for your whole life and said
you've got a hundred, choose wisely. What do you want
to share? Like, think about how the incentive for growth
was built into this platform from day one, and that
may create a whole set of incentives and patterns and
structures on the app. But if you had designed that
(22:58):
day one differently, for scarcity and not growth, how would that
feel different? How would people approach things? And I love
that, because a lot of times the scarcity mindset leads
us down when we talk about, like, dating or taking jobs
or this or that. Sometimes it's not great. But with this,
it's like, wait, that would be helpful. I wouldn't be
worried about every day I have to post five things
(23:18):
so people see my stuff. It's like, I don't want
to post this because it's not that important, right? And
I think it's because I share the same perspective on
scarcity mindset. It often leads to dysfunction. But the one
thing that is truly scarce in this world is actually time.
Like, you can't get more of it; it
is actually the scarce thing. What could it look like
if we designed for the future and said, let's design differently, right?
(23:41):
And that's actually the exercise we're in at the moment. It's
thinking about what the design should be, because what we
have isn't working. And that's where the innovation comes in. Okay,
if we want more boundaries and structure, how do we
design for control? How do we add those layers on it?
How do we grow up? Right? Because what we're seeing
is, like, the Internet is an adolescent, it's young, and we're
(24:03):
growing up and figuring out what boundaries we need. And
I think that is what's really fun about this time. Wait,
can we do that? It feels like we're going backwards,
but can we be like, Instagram, we're changing? Like, how
do we do that? So I think that's what's so
fun about this moment, right, because it might not be Instagram.
It might be that Instagram decides we want to stay
the way we are, sorry, take it or leave it,
(24:24):
and then, there are so many people working
in this space who are like, okay, first of all, you
may get sued by the Federal Trade Commission, Instagram, for
a variety of your privacy practices, and, like, there's all
this legislation and other government movement to break up these platforms.
So maybe, if they decide to stay the way
they are, they're not allowed to. It may also be
(24:46):
that enough people want other competitors, that people are like
I don't want to be on a platform that's like this,
I want to be on this different kind of place,
and we see that movement happen too. Because remember,
MySpace was a big deal not that long ago, right?
Everyone was on MySpace. Everyone was. And now
we're like, who, what is MySpace? Right? So movements happen,
(25:06):
but we've got to remember that too. We do have agency,
we do have control, and quitting one platform doesn't mean
quitting the Internet or social It means that we might
evolve to something else. Okay, that is something that I
think we need to remember. Going back to my example
of like why do I have this shirt in my closet?
I don't like it and I bought it. I'm like
I just want this ad to go away almost, or
like I guess I need this, But I do have
(25:29):
agency and control and that platform sometimes makes me feel
like I don't. So rather than Instagram changing, if we
all start, because a lot of us are feeling
this way, and something else is created, we have the
power to say I don't want to use this platform anymore.
There's something that fits my needs. I have power, yeah,
And I think that is the thing for people to remember.
(25:49):
And these platforms don't want to remind you that you
have power, right? That's not in their incentive. They don't
want you to feel empowered or that you have agency.
But that is the reality of this moment, and that
we've done this before. People left MySpace and they
went to Facebook, and people left Facebook and they
went to Twitter, and they left Twitter and they
went to Instagram, and they're leaving Instagram. They're going to
(26:10):
TikTok, and it'll be something else we haven't heard about.
Like, this happens, people do it. And also, like, the
smartphone is just over what, ten years old, ten, fifteen
years old? Like, this isn't gravity, okay? Like, this isn't
a law of physics. This stuff is choices, and they're hard.
So I want to just be very honest about the
fact that it's hard, but we can do hard things. Right,
(26:31):
it's hard, but we can do hard things. We can, we
can. And also, I just have a question that I
don't know if you can fully answer. With all of
the stuff on those platforms, specifically Instagram and Facebook, what's
the legality that's even the right word, Like for them
to gather all our information and then expan us like
that feels wrong? Yes, And so this is I think
(26:52):
an area where I call it regulatory imagination, Like legislators
and regulators are imagining what they can do in this
space in a way that they were not ten years ago, right,
because everyone's like what can we do? And so for
many of these platforms, they're looking at, well, what is
the data you're collecting from children? Is that allowed? Do
we allow you to collect this much information from kids?
(27:15):
Do we let kids say, um, hi, sorry, Facebook, I
turned eighteen, get rid of everything that you had
about me from before? Which is actually a law in California, right.
You're able to? Yeah. And so, like, states are experimenting with this,
Europe is experimenting with this. So to answer your question,
like what's the legality here, we get to make it
and everyone is trying, right. So in Europe there's
(27:37):
something called the GDPR, which is a data protection
regulation and they're implementing all kinds of ideas of hey,
what does content moderation look like here? Should a human
being be the one who's deciding to take something down?
So Facebook can't just say, oops, sorry, plus sized woman,
I took your picture down by accident because the algorithm
(27:57):
did it, right? How do we impose this kind
of structure on what these platforms are doing. This is
all happening in real time now, and everyone's trying different things.
But there's a lot of rules around privacy and the
collection of data to target you. That targeting information is like, hey, Katherine,
I think you're gonna like this pair of, like, wide
(28:18):
leg pants. Like, you're gonna like it. I'm going to
just push it to you a thousand times a day.
There is a lot of legislative scrutiny of, do we
want to allow that? Should Catherine be able to opt out?
Should she be able to say, no, I don't want
you to target
me with clothes. I
want you to get rid of all the data you
(28:39):
collected about me for this. I want you to only
be able to use data that you got from this app,
not buying it on the open market to target me,
you know, as a profile. All of
that's happening now, which feels good. When I was making
one of my websites, I was working with a business consultant,
and I have a newsletter. Well I had a newsletter,
(29:00):
sign up for it, I'm gonna get it back up
and running soon. But we were creating, like, the pop-up
form and the opt-out and whatever, and he,
I don't remember the language because it's over
my head, but I just remember him being so serious about, well,
they need to be able to opt out or you
can't just take their stuff. And I
was like, why do other people do it to me? Like,
(29:20):
I just wanted to know why. I'm a why person?
And I'm like, I get emails all the time, and
I've never signed up for this email subscription. I've unchecked
the box when I've checked out at let's say, like
I don't know, like Lululemon. I actually checked the
box for that. I want those emails, but like there's
plenty of things that I'm like, I bought something from
here one time. I know I didn't check that box.
Why am I getting four emails a day? That seems
(29:42):
like it should be illegal. Yeah, and it is, and
so the question is, because there's all of this sale
happening of your data, those opt-outs are there for a reason. And when
people don't comply, right, which happens, they're like, oh, sorry,
we're gonna keep emailing you. They can and do get
in trouble by state regulators and federal regulators. The problem
(30:02):
is often that the fines haven't been that much money,
and it's like jaywalking, right. You're like, what
are the chances I'm gonna get in trouble this time?
But the more attention we put to it, the more
the public is like, hey, I don't like this and
I'm going to report you. The more room you're giving
these regulators to be like you know what, Yeah, last
(30:23):
time we fined you five hundred dollars and you, company
X, you didn't care, because five hundred dollars to you, like, okay,
you sell, you know, five pairs of Lululemon leggings,
and you're like, that's fine, it's worth it.
Let's raise the fine amount. Let's stop paying attention to
lobbyists who are saying no, no, no, it'll self-correct.
Enough time has passed. You didn't fix it on your own.
(30:45):
And that's really like, I want people to be so
empowered by the moment that we're in because there were
all of these beliefs that people had before: it'll fix itself eventually,
people are still figuring it out, there aren't that many
bad actors. And now we're like, no, you had enough time,
we're changing it now. Yeah, okay, well then can we
(31:06):
now talk about the Markup then? Totally? Okay? Tell us
what that is and why you created it and all
the things. The Markup is a new media organization that
is looking at a very simple question, how is technology
reshaping our world? And that means like the big tech
that we've been talking about, Amazon, Google, whatever, but also
the tech you haven't heard of. Right, So, technology like
(31:30):
universities using these things called student risk algorithms to determine
who they think is at high risk of failing out
of a particular subject. Surprise: the algorithm will say things
like women and black people usually aren't in STEM, so
they're at a high risk of failing out. I mean,
really like the algorithms are making these sort of stupid
I'll call them stupid. I was gonna say that doesn't
(31:50):
make sense. Here's why: the algorithms
are making very stupid interpretations of historical data, saying, well,
we haven't seen women and people of color in science, technology, engineering,
and math, so we think that they're rare, so we
think they're at a high risk of failing out. Now,
(32:11):
you and I as human beings can be like, yeah, no,
that's wrong a billion different ways, right, Like, there's a
million reasons why the representative data of the past is
not actually indicative or instructive for the future. But again,
like a lot of these algorithms are stupid, right, Like
they're trained on the data they have and they make
inferences that are not full and robust, and sometimes they're nonsensical.
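The failure mode described here can be sketched in a few lines of code. This is a deliberately toy illustration with invented numbers, not any vendor's actual student risk algorithm:

```python
# A toy illustration of the failure mode above: a naive "risk" model
# that scores a student as high-risk simply because their group is rare
# in (made-up) historical enrollment data. Not any real vendor's system.
from collections import Counter

# Skewed historical data: who enrolled in STEM in the past (invented).
history = ["man"] * 80 + ["woman"] * 20
counts = Counter(history)
total = sum(counts.values())

def naive_risk_score(group: str) -> float:
    """Rarer in the historical data => scored as higher 'risk'."""
    return 1.0 - counts.get(group, 0) / total  # 0.0 = low, 1.0 = high

# The model penalizes underrepresentation, not ability:
print(round(naive_risk_score("man"), 2))    # 0.2
print(round(naive_risk_score("woman"), 2))  # 0.8
```

The score reflects only who showed up in the past, which is exactly the "stupid interpretation of historical data" being described.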
(32:32):
So this is an example of actually reporting that we
did earlier this year. We identified one of these algorithms
that had this kind of output. We published the story,
and we told four hundred different colleges and universities that
were using this algorithm: hey, you might want to take
a look at this thing that you've signed up for.
And I want to give major props to Texas A&M
University. That day, they were done. We're not using this algorithm.
(32:55):
Thank you for telling us. We did not know that
this was the conclusion that it was spitting out,
because if you're, like, sitting at the school, you're
just getting reports from this algorithm. Okay, you don't know
all of the ways that it's flawed. You trust it.
And so that's what we're doing. We are here to say, hey,
there's a lot going on behind the scenes for big
tech and this other tech you don't know about,
(33:17):
and it has real impact on people's lives, on their schooling,
on their education, on their mortgages, on their businesses, and
we're gonna tell you what that is so you can
make better choices. And that's what we do, which is
so cool because is anybody else doing that? You're
just trying to help people. We
call it service journalism, right, it's journalism as
(33:37):
a service. We're trying to help you. And one thing
that our team does, and like our team is just
full of rock star journalists, data scientists, technologists, engineers, like
they're amazing. They'll build these tools, like one tool that
I love is called Blacklight. Anyone can use it.
It's the markup dot org slash blacklight. You can
put in your favorite websites. So put in girl scouts
(33:59):
dot org. It will show you all of the data
that's being collected about you when you go to girl
scouts dot org. So, talking about the data collection that
we just mentioned earlier in the conversation, there's a lot
of places you go on the Internet and you don't
think that data is going to be collected about you,
like you're just visiting a website. So we built this
tool to help people understand in real time for the
(34:20):
sites you go to, what are they taking about you?
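A very rough sketch of the kind of check a tracker scanner performs: parse a page's HTML and list the third-party hosts its scripts and images load from. The page content and domain names below are invented, and the real Blacklight tool works quite differently (it loads pages in an instrumented browser):

```python
# Hypothetical sketch of a tracker scan: find third-party hosts that a
# page's scripts, images, and iframes are loaded from. Illustrative only.
from html.parser import HTMLParser
from urllib.parse import urlparse

class ThirdPartyFinder(HTMLParser):
    """Collects hosts of resources not served by the first-party domain."""

    def __init__(self, first_party: str):
        super().__init__()
        self.first_party = first_party
        self.third_parties = set()

    def handle_starttag(self, tag, attrs):
        if tag not in ("script", "img", "iframe"):
            return
        src = dict(attrs).get("src") or ""
        host = urlparse(src).netloc
        if host and not host.endswith(self.first_party):
            self.third_parties.add(host)

# Made-up page HTML for illustration:
page = """
<html><body>
  <img src="https://shop.example/logo.png">
  <script src="https://ads.tracker-a.com/pixel.js"></script>
  <img src="https://metrics.tracker-b.net/1x1.gif" width="1" height="1">
</body></html>
"""

finder = ThirdPartyFinder("shop.example")
finder.feed(page)
print(sorted(finder.third_parties))  # ['ads.tracker-a.com', 'metrics.tracker-b.net']
```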
And we did a big investigation using this tool, and
we found things like when you go to an abortion
provider website, the data they're collecting on you. Now, I
bet you if you go to an abortion provider website,
that's not a moment in which you think someone's collecting
data about you. And we know that data could be
extremely sensitive health wise, I mean now criminality wise, if
(34:45):
you're sitting in Texas, like, collecting this data has consequences,
and we want people to know what's happening and where
Wait a second, now I feel fear, right, because I'm
like, no! In the sense that, like, if I go
to this right now... Well, can I test this out
while we're talking? Okay, so the markup dot org slash blacklight.
(35:05):
I go to there and then I type in a website.
Just enter a website. Okay, what's a website that I
don't even know? Well, let's just go to Lululemon. Oh,
that's a fun one. Now I'm curious about Lululemon.
I haven't tried that one, and honestly I should have.
I also like their leggings, I know. I'm like please. Okay,
So I just typed it in. You'll see testing for ad trackers,
third-party cookies, tracking that evades cookie blockers. So you
(35:27):
may think, I have a cookie blocker, I'm fine. No, no.
If there's tracking that evades that, session monitoring scripts, key-
stroke capturing, Facebook and Google, this website sending data about
you to Facebook. Wait a second. Wait, okay, here's a question.
I log in through Facebook on so many
things because it's just easier. What is that doing? It is
giving Facebook all of the information they have agreed
(35:50):
to gather from that website about you. So let's say
you use Facebook to log into like a calendar app
that you love. Everything from that calendar app is now
also going to Facebook. And this is one reason Facebook
has so much data, because every time you put a
little Facebook tracking pixel on, you know, like a
(36:11):
like-me-on-Facebook button, right, if you put that on your website,
they put a pixel on that site and they collect
information from all of the people that come to your website.
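The pixel mechanism described here can be sketched as follows. The endpoint, parameter names, and function are hypothetical, purely to show that a "pixel" is just a tiny image request whose URL reports where you are:

```python
# Hypothetical sketch of a tracking pixel: the "pixel" is a tiny image
# whose URL carries the page you're on. The endpoint and parameters
# below are made up; this is not Facebook's real API.
from urllib.parse import urlencode

TRACKER_ENDPOINT = "https://tracker.example.com/pixel.gif"  # invented

def pixel_url(page_url: str, event: str = "PageView") -> str:
    """Build the image URL an embedded pixel would silently request."""
    return f"{TRACKER_ENDPOINT}?{urlencode({'page': page_url, 'event': event})}"

# Every site embedding this pixel reports your visit to the same tracker,
# which recognizes you via the cookie the browser attaches to the request.
print(pixel_url("https://calendar-app.example/settings"))
```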
Stop it. Also, mine's still loading. Is it still loading?
Mine's still loading. It's probably because of too much stuff. Man,
I hope not. When it is done... Well, it just
says it's currently testing, so we'll
(36:33):
just wait for it. But it'll tell you like six
ad trackers, this many cookies, this many of this, it'll
tell you all this stuff. And if you scroll down
you can also see the articles that we wrote. Like
so one notable one is the high privacy cost of
a free website. So we get into you know, these
websites look free, but what's actually happening on, like, a WordPress
(36:53):
site or a Square dot com site, those free things, anything, right.
Like you go to a website, you're like, I'm just
checking you out. However, there may be this invisible trade
that happens of your data and we just want you
to know about it. And you'll see our approach is
very much like, I'm not trying to give you a problem with no solution.
So another thing you'll see in our website is at
(37:14):
the bottom of the Blacklight page: I scanned websites
I visit with Blacklight and it's horrifying. Now,
we always want to give you... I don't
want you to be sitting there on your
hands, terrified, trying to throw your computer out
the window, right. We want to give you things you
can do about it. So our advice is, you might want
to switch browsers, you might want to add some privacy extensions.
(37:37):
We give you some examples of them in the site.
So we just want to give you what you need
to be able to take control of your life. You're
not saying don't ever go to this website again. You're like, Okay,
now you know, you know you have awareness. Here are
things you can do to protect yourself. I just think
scaring people and not giving them a pathway forward leads
to that sort of binary black and white thinking that
you mentioned earlier. It's like, well, now you know,
(38:00):
so if you care about it you're going to opt out,
and that's not real life, because real life doesn't mean
you can opt out. And by the way, in real life,
if you were driving on the road and there was
a pothole and you drove over the pothole and you
got a busted tire, imagine if people were like, that's
what you get for driving, Okay, you shouldn't be driving
on the road. That it would be insane. We would
look at that person and be like, you are bananagrams.
(38:22):
So that's not a real solution.
We would say, there's a pothole, I'm calling the city government.
I'm letting them know about the pothole, and I want
to make sure that pothole is filled. That is what
I want us to move to. The sort of civic
understanding and responsibility of the internet as a utility. We're
all using it, okay, so we want it to be better.
We're not saying stop driving on the road. We're saying
(38:44):
fix the road, right, because we can't not drive. We
can't... I mean, we can, but
what's the consequence of not driving? So what's the consequence
of never being able to go on the internet again.
I mean, it's just not realistic. And if
that's the advice that you give people, they're not
going to take it. And so they're just going to
either ignore the issue or feel bad all the time
(39:05):
that something is being done to them and they can't
stop it. And that's not the society that I
think we live in. Like, I believe that people
have agency, and so one of our amazing engineers has
this phrase called agency not apathy. We want people to
feel agency, not apathy about what's going on, and that's
the heart of the markup. I like this idea because
(39:26):
I had the feeling of like, I'm going to see
all the stuff that these websites are taking from me.
But that fear can be a motivator in the sense
that I'm scared because I care about what people are
doing and what's happening. And so rather than, I'm
feeling fear, I'm just going to hide, I'm never gonna
drive again, I'm never going to enter again, whatever,
it's, I feel fear, I care about this, therefore I'm
(39:47):
going to do something differently. And you're saying here are
the tools and here's what you can do, which thank
you for doing that. You're welcome. So Blacklight's one
of the tools. There's three that I saw. Can we
talk about those? Of course. So we have Blacklight.
We have a tool called Split Screen. So, you know,
(40:10):
we all kind of intuitively know that our neighbors might
not be seeing the same thing on Facebook that we do, right,
Like I have my portfolio things that I read that
I'm targeted with news wise on Facebook, and I know
that other folks that I know they're getting news from
very different sources. So we have this tool called Split
Screen that is an almost real-time tool to see,
(40:32):
like what are women seeing on Facebook? What are men
seeing on Facebook? What are boomers seeing on Facebook? What
are millennials seeing on Facebook? And the way we're able
to do that is that we actually built a representative
statistical panel across the country and said, hey, folks, use
this tool that we made called Citizen Browser. We're not
going to collect any of your private information, but we
(40:54):
want to know what news articles are being pushed to
you by Facebook because it's interesting to us. We want
to know just how different are people's news universes. After
something like the January sixth incident, right. So, like, what
are Trump voters seeing in terms of the news that's
been pushed to them, and what are Biden voters seeing? Right?
(41:15):
And so we built this to answer those questions. We
kind of all intuitively knew, right, like, oh yeah, you're
seeing different stuff than me. But we wanted to know
the answer with data, right. We wanted to know it
with science because we think if you know with certainty,
then you can fix it in reality. But if you
just operate with these stereotypes, right, you
can't move the ball. And all I want to do
(41:36):
is move the ball to a better future. So anyone
can check out the Markup's Split Screen and just
kind of see like what's going on. Well, and that
is fascinating too, because I'm a big believer in Again,
this goes with everything we're saying, but like, you start
to believe what you see. So if I'm this demographic
and I'm getting all this, and let's say my grandfather
(41:56):
is obviously a different demographic, no wonder we have different
views and we have different information when we're talking about
something that's happening in the world, because we're both getting
different information and probably an influx of, like, not even information. Agreed.
What you see creates your reality. So we have to
understand what are these platforms doing to create that different
(42:19):
reality and then be able to call them on it,
right, to be like, hey, you said... Like, an article
that we wrote at the Markup was, hey, Facebook, you
said that you weren't going to push election-related
news right before the election, right. Like, Mark Zuckerberg said,
We're not going to recommend these political groups is what
(42:39):
he said. We're like, okay, but we were collecting data
with our Split Screen, our Citizen Browser panel, and we
were able to see that, in fact, you were pushing
these Facebook groups to people right around the election and
continuing afterwards, Like you said that you weren't going to
but you did, So what are we going to do
about that? Yeah, And that's a difference of like, Okay,
(43:00):
anybody can say anything, but we're showing you guys what's
really happening exactly. I appreciate that. Okay, I have my
Blacklight thing up. I did Athleta instead. I don't
know why. You have Lululemon on yours? I just
closed it out. I have the New York Times, that's fine, which
has a lot, so you can help explain this to me.
It says, one ad tracker found on this site. This is
(43:21):
less than half of the average of seven that we
found on popular sites. What is that? So it'll tell you, um,
if you click on it, it can give you a
little bit more explanation, I think. But we see that
on average there are seven on different websites
around the Internet, and so
Athleta on that front is doing better than the average.
And that's like a good thing to know, to be like,
(43:42):
all right, thank you friends. And then when you click
into it, it will give you an explanation of like, well,
what's an ad tracker? Right? Because again, like I don't
want to assume that anyone knows anything about the internet,
because we all have full lives. Okay, people have expertise
in different things. It's not your fault that you don't
know what a cookie is or an ad tracker or something.
(44:02):
It's not anyone's fault. Our job is to be like, Okay,
well you want to know, here you go. Yeah, I
appreciate that because I have no idea, okay, And it
says third-party cookies: fourteen third-party cookies were found.
This is more than the average of three that we found
on popular sites. Okay, what does that mean? So a
third party cookie is something that tracks you around as
you browse the internet. So, like, let's say you go
(44:25):
to one website. If there's a cookie on that website,
it can see, not only, like, where
are you scrolling, what are you clicking through, what kind
of leggings did you linger on? What what did you
put into your basket? It can do that for that website,
and potentially, depending on the cookie, it can also follow
you to other places that you go, even when you
leave Athleta dot com. Okay, so they're not doing good there.
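The cross-site tracking described here can be simulated in miniature: one ad network embedded on two unrelated sites sets a single cookie and links activity from both into one profile. All names in this sketch are made up for illustration:

```python
# A made-up simulation of third-party-cookie tracking: two unrelated
# sites embed the same ad network, which sets one cookie in your browser
# and can therefore link your activity on both into a single profile.
import uuid

class AdNetwork:
    def __init__(self):
        self.profiles = {}  # cookie id -> list of (site, action) events

    def track(self, browser: dict, site: str, action: str) -> None:
        # Reuse the visitor's existing third-party cookie, or set a new one.
        cookie = browser.setdefault("adnet_cookie", str(uuid.uuid4()))
        self.profiles.setdefault(cookie, []).append((site, action))

adnet = AdNetwork()
browser = {}  # stands in for your browser's cookie jar

adnet.track(browser, "athleta.example", "viewed leggings")
adnet.track(browser, "news.example", "read article")

# One cookie, one profile spanning both sites:
cookie = browser["adnet_cookie"]
print(adnet.profiles[cookie])  # [('athleta.example', 'viewed leggings'), ('news.example', 'read article')]
```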
(44:46):
This website loads trackers on your computer that are designed
to evade third-party cookie blockers. Sneaky. Yeah, I don't
like that. That's all it found. So, I mean,
I guess that's generally not horrible. So you said New
York Times is pretty scary. And New York Times wasn't
the greatest, although, you know, here's
the thing. So they have third-party cookies. Remember, the
(45:06):
third party cookies can actually be good. And this is
where the explanations matter. So let's say you put something
in your basket at Athleta and then you, like, click
away for a second and you come back and it remembers the
stuff in your basket. Okay, well, cookies are important for that,
but that might not be a third party cookie. So
what we want to do with Blacklight is to
give you the opportunity to be like, okay, you
have twenty-four cookies. What are
(45:29):
these cookies? Right? Like, let's dig into this. So when
I put in Athleta and went to athleta dot gap
dot com, it tells me that the cookies are Twitter. Okay.
All that means is they have like a little Twitter
thing on their site, so you could click
Twitter and then go straight to their Twitter. That seems fine.
New Relic, who's that? Bidtellect, who's that? Monotype Imaging? Okay? Microsoft?
(45:53):
Oh, I know them. Maybe it's related to a service they're
using. Verizon? Okay. IPONWEB? And then you look
at all these names and you're like, who are these people?
Let me google them? And so what we want to
give you is like, look, we can't tell you everything
you need to know, but we can tell you where
you can keep going. So maybe you should go Google
Bidtellect Inc. and see what they do. You might
(46:13):
decide that they're okay. You might be like, actually,
they're fine. I don't care if they or Verizon or
Warner Media have my information. That seems all right, But
we want you to make that decision with agency, which
is probably part of that article that you were talking about,
is like it tells you what to do with the information. Okay,
So that's just interesting. I'm going to play with that
all day now. I feel wonderful. What's the best way
(46:35):
for people to go and get all this information? Because
I know that I'm very interested in it and want
to learn more, and I'm assuming people hearing this are
going to be too. So we have a newsletter called
Klaxon that you can sign up for, and Klaxon will
just send you a little newsletter, like just a little
message every time we have a new article. We publish twice
a week, so, again, it's not like a
ton to keep up with. We want to make sure
(46:57):
it's a manageable amount so you can actually wrap your
mind around it. And we also have a series called
Ask the Markup. So if you're like, um, this thing
is happening on the internet, I've seen it. I've always
wondered, you know, what's the deal here? Um,
you can send in those kinds of questions to us too.
We had a question where someone was like, Hey, what's
(47:17):
going on with Venmo? And, like, everyone can see my
transactions. Just, like, explain Venmo to me. And that's an
article that we did. We did one about Klarna and
Afterpay, and, like, anything that you... I was reading
the Afterpay one. I think I was reading that.
People... Like, we are here for people's questions. You know,
it's complicated and so we want to put the tools
(47:39):
in your hands to make better choices. Let us know
how we can help. I'm interested in the Venmo one
because why is there a social feed on Venmo? You
can turn it off. It makes no sense. So I
didn't know you could. You can turn it off. You
can go through the privacy settings. Yeah, there you go.
So that's the thing you can do now: go in
through the privacy settings and turn it off
so people don't see things. But, like, you can just snoop
(47:59):
on everyone's life if they haven't turned it off, or
you're like, okay, and why do I need to know that,
and you know what that makes me think of, like, gosh,
like dating, And you know, I'm a therapist, so I
hear a lot about people's dating lives, and and I'm
a human, so I talk to people and hear about
their dating lives. And I have one too, But like
the amount of people that are like I saw so
and so pay so and so on Venmo, and I'm like,
(48:20):
oh my gosh, Like we have enough ways to stalk people.
We don't need to know who's paying who for tacos exactly.
And then because I hear it from my friends all
the time, like so and so had the little champagne
emoji with another person, what's that about? And it's like,
you know, it's probably nothing. And actually, we don't need
to live in a surveillance state where we're watching every
(48:42):
single thing every other person does. Just, yeah, don't
do that. We don't need to do that. Well, I've
loved talking to you so much. This is great. I
feel like I'm going to have about five questions for
you after I dig into all this stuff. That's what
I love. I really am so grateful for all of
this because I knew this was going to be a
different kind of episode, but the amount of time I
(49:04):
spend talking about the media, social media, just all this
stuff with people in therapy and just like in my life,
like we need people like you who are helping
us keep track of all of this, in a way that
is helpful to us. What I've gotten out of this
conversation is like, we don't need to like kill the internet.
(49:24):
We just need to remember why we have it, what
we wanted to actually do for us, and how do
we put our efforts into that direction. That's absolutely right,
and that, honestly, like, that's like our new
mission statement, but that's what it is. It's like, preserve
the good, get rid of and fix the bad, and we
are in a really special opportunity space right now to
(49:46):
do that. So let's do it. Let's freaking do it.
I've been getting in my head the past probably a
couple of months of being like I hate TikTok. I
hate this, but like I don't hate it, but like
I feel like I have to hate it because of
the little bits of things that don't help. And
now I feel refreshed. Like, I don't have
to hate this stuff. I just have to remember why
(50:06):
can it be good? And how do I put my
attention to that? Thank you? Also, we can talk at
another time about how much I love TikTok and how
much parenting advice I get from TikTok. We'll see how
that turns out in eighteen years. Well that's the thing,
like we should we need to talk about that another
time because I love it too. And I love it
because it helps me, like, you know, things that aren't
(50:27):
so serious. I can laugh at things, and it's funny,
and it's an easy way to access that. It helps people
be creative, and there's so much good to it.
But I also see so much misinformation that I'm like,
stop saying that. And how many clients send me TikTok's
and are like, is this true? Why have you never
told me that? And I'm like, because it's not true.
I never said it because it's not true. Also, people sending
(50:50):
me TikTok and I'm like, I don't think that's information
that I want you to have right now, Like it's
information that I would give you in a session where
we could process it and all. There's so much to it.
I talk about it all the time, but it can
be used for good. I mean, I
feel like there's a whole other hour on like the
difference between expertise, knowledge and information. The Internet gives us information,
but not expertise and not real knowledge because knowledge is structured. Right, No,
(51:14):
this at this time and then this and then this.
That's it. Yeah, we can talk about that at another time. Okay,
I can't wait. And that last part, I'm like, okay,
I needed that in words. Well, thank you
so much, thank you, talk soon, all right, bye,