Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome back
to There Are No Girls on the Internet. And this
is another installment of our weekly round up where we
dig into the stories that you might have missed on the
Speaker 2 (00:23):
Internet so you don't have to.
Speaker 1 (00:25):
And I am thrilled to introduce my guest co host,
Sammy Kanter, founder of Girl and the Gov, one of
my favorite newsletters. Thank you so much for being here, Sammy.
Speaker 3 (00:34):
Oh gosh, thank you so much for having me. I
am such a fangirl, so apologies in advance because you've
seen me fangirling over here to be on the show. Beyond,
I'm so excited to get into it all. So thank
you so much.
Speaker 1 (00:47):
Oh my god, I'm very excited to have you on.
You're one of my favorite Instagram follows. You do something
that I think is really tough that I aspire to do,
where you find a good way of blending all of
the fun of social media with also the politics and
government stuff that people need to know about. I often
feel sort of torn between, well, I want my audience
(01:08):
to know about this, like they should know about this,
but I don't want to bore them with some like
government topic, even though it's very important. I feel that
you have really threaded the needle on social media of
how you do that.
Speaker 3 (01:18):
Honored. First of all, we'll be clipping that and
sending that to anyone I ever pitch for funding in
the near and distant future all at the same time.
But yeah, it's an interesting way of looking at politics
or trying to really essentially get it in front of
new eyes. And if we've learned anything from the last number
(01:39):
of years is that the old formula isn't working. And
what I really try and do, whether it's on Insta or
TikTok or YouTube shorts, for God's sake, is figure
out a way that it slips into people's day to day,
their radars, what they're scrolling through anyways. So if that's
an aesthetic outfit video that also happens to have three
(02:01):
news stories on it in a way that's explained how
we actually speak, then by all means I'll keep doing it.
Speaker 1 (02:08):
Okay, Well, the first story I want to talk about
I think kind of fits in your wheelhouse, and that
is this story about influencer vibe theft.
Speaker 2 (02:17):
So the question is.
Speaker 1 (02:18):
Sort of, can someone sue you for copying your vibe
or aesthetic on social media?
Speaker 2 (02:24):
Did you hear about this story?
Speaker 3 (02:26):
I did, and so what was interesting is I heard
just sort of the general overview in the People article,
and I opened my phone maybe an hour before actually
we hopped on here, and the influencer that was suing
was giving her spiel about like what actually went down
and how she felt like the People article and also
any other sources that covered it were like missing the
(02:48):
beef of it, the real, you know, the real core
of what the lawsuit was, which I found so interesting
and also shows, from that angle, like how sometimes
things can get lost in the social media sauce, aka
where's nuance? Nuance is MIA.
Speaker 1 (03:04):
So in her TikTok, she says that this is so
much more than her trying to copyright a beige aesthetic,
and that the media is sort of failing to tell
this story accurately because they're focusing on the most ridiculous
parts of her suit, the beige aesthetic part of it,
and ignoring the more interesting, meatier aspects of it.
Speaker 4 (03:22):
I believe her intention was to look so similar to
me and copy my post so similarly that she could
profit off my business. A lot of articles are claiming
I'm suing over a beige aesthetic. I have never claimed
to own beige, and I'm not suing anyone over a
color or a trend. The story the media is telling
is missing so many details, and the way that they're
spinning it to make it seem like it's all about
(03:44):
an aesthetic to create drama is misrepresenting the importance of
this case.
Speaker 1 (03:49):
So the meat of what's happening here is that it's
a lawsuit where one influencer is not just saying that
somebody else is mimicking her aesthetic, which is this beige
neutral vibe, but going so much further to mimic many
aspects about her business as an influencer that she argues
is being done intentionally to create brand confusion and siphon
from her business as an influencer and content creator. So
(04:11):
while you might have heard of this lawsuit as oh,
this influencer thinks she owns the color beige, and yeah,
that part of it might be kind of funny, the
issue is so much deeper than that. It goes to
can someone essentially steal the entire look and feel of
someone else's business, even if that business relies on visual
aspects or visual aesthetics that may not traditionally be able
(04:33):
to be copyrighted as somebody else's intellectual property. You know,
the way that somebody wears their hair or a specific
tattoo or a specific manicure, or a specific pose in photos.
So I looked both of these influencers up and it
is uncanny, like it goes so much further than just
a beige look. Here's what went down. Influencer Sydney Nicole
Gifford did a joint photo shoot in twenty twenty three
(04:55):
with another influencer named Alyssa Sheil.
Speaker 2 (04:57):
After this shoot, Gifford
Speaker 1 (04:58):
claims that Sheil started copying her sort of minimal beige
aesthetic and specific poses in her content. But it's not
just about the specific pose, because when she would do that,
you know well worn influencer pose where you have the
phone in front of your face in a mirror, like
obscuring your face.
Speaker 2 (05:16):
The two are essentially identical, like it is very
Speaker 1 (05:20):
Difficult to tell one influencer from the other. Like she's
not wrong, it is uncanny and if you're making money
from your look as an influencer, and somebody else comes
in and starts doing that exact same thing to maybe
create a little bit of brand confusion, I get it.
Then Sheil blocked her on Instagram and TikTok. So Gifford
sent a cease and desist, and those cease and desists were ignored.
(05:42):
So Gifford filed a first of its kind federal lawsuit
in twenty twenty four alleging copyright infringement, trade dress misappropriation,
and unfair competition. So, like you, Sammy, I don't think
of this as like a frivolous lawsuit. Like I was like, oh,
this is deeply interesting because as we know, influencers and
(06:02):
content creators are a big part of the lifeblood of
like e commerce, and so it's like how brands like
Amazon become very economically successful. So we're not talking about
just like, oh, she stole my look. What she's arguing
about is like a very specific kind of copyright and
IP dispute. And so it raised the question of can
somebody sue somebody for allegedly like stealing their their intellectual
(06:26):
property on social media, which again I just find super interesting.
According to the fashion law site, Gifford, who commands over
half a million followers across platforms like Instagram, TikTok, and Amazon
storefront, alleges that Sheil systematically replicated her content and online
persona to mislead followers and unfairly compete in the influencer
marketing space. Central to Gifford's suit are her claims that
(06:49):
Sheil copied her copyright-protected images, mimicked her appearance down
to specific physical traits such as a flower tattoo, and
appropriated the distinctive visual and stylistic elements associated with her brand.
So something I do find interesting about this is I
just I mean, I'm no attorney, but visually, when I
look at both women's content, there is definitely a lot
(07:12):
of aesthetic overlap happening. One of the women just recently
had a child, and I'm like, okay,
well, that is a distinction, because otherwise they're
so similar. And it makes me wonder, like, in the
age of Instagram, can anybody claim to sort of
be unique or distinct when so much of what gets
(07:35):
popular on Instagram because of algorithms is things that seem
like copies of each other, right, Like all of the
influencers use this specific pose. They all have a
specific manicure, a specific look. If both of these women
are sort of shilling for Amazon clothing, like, it's
not surprising that they might have some overlap.
Speaker 2 (07:53):
But the case actually was
Speaker 1 (07:56):
essentially dropped, with the influencer, the one accused of doing
the copying, having to pay nothing. It seems like
that case might have answered that question of like, can
you copyright an aesthetic? I'm not an attorney, but it
seems like from this lawsuit the answer is perhaps no.
Speaker 3 (08:10):
Yeah. I think it's super interesting. And what the girl
that was suing said in her TikTok was that she
had to drop the case because it was getting incredibly expensive.
So had she had the funds, like, there may have
been a continued pursuit at least on some frame here,
which I find interesting in and of itself. And I
(08:32):
do think that in the creator sphere too, money gets
in the way, right? Think about how many people
would actually sue if they had the money to do
so in terms of somebody copying them. We see it
with small brands all the time with big brands.
I don't know if you saw this, but there is
in the city this vintage brand called Kissing Cowboys and
(08:54):
it's run by also like an influencer, one of those
people that's just like at the cross section of many things.
And Brandy Melville allegedly copied her logo, which she went
on to say that is not just like a Canva
logo or something like that. Her and her sister handmade
the logo and the brand has copy and pasted the
(09:14):
exact logo onto a T shirt that they have god
knows how many SKUs of that they're selling in their stores.
And so you have a small creator that has a
small brand that doesn't have the funds most likely to
sue to pursue this. So I think like that also is
particularly interesting to me because if creators had more monetary means,
(09:36):
and same with small brands, where would that really lie,
Like what would the decision process actually look like for
that? It just leaves me with more questions. But I
do think in this particular case, the thing that gave
me the ooh, that's a little creepy and maybe it's
almost like a different angle is seeing the matching tattoos,
(09:57):
the blocking almost the intentionality of it, because you also
go into that other frame, right, Like, if you are
an influencer, you expect people to copy you, right, That's
part of the job is to push people to do
things that you're doing, or to lead them to an
idea to consider it. So where does that start and
where does that end? I think there definitely is some
(10:18):
gray area when the person is starting to look exactly
the same as somebody and copying things on the
same timetable. Yes, uh, that's where I'm like, ooh.
Speaker 1 (10:27):
She alleges that she has a very distinctive
flower tattoo on her arm, and she says that this
influencer who was copying her got the same distinctive tattoo. Like,
first of all, that is commitment to the bit, that is,
you were really committing to copying this woman. But it
is uncanny and the fact that they're both specifically Amazon influencers,
they both do that Amazon storefront you know, hashtag Amazon finds.
(10:51):
She basically is arguing that Sheil is intentionally trying to
confuse what would be their audience market into like
associating the two, which I think is that's like a
very good point. So, reflecting on this, professor of law
and media, faculty director of the Center for Law, Information
and Creativity, Alexandra Roberts wrote, surprising or not, deeming any
of Gifford's claims plausible for creators posting similar content
has serious implications thanks to the fact that influencer marketing
has serious implications thanks to the fact that influencer marketing
has become increasingly central to commerce and social media. Content
more broadly is built on trends and served to users
via algorithms, which are quick to push content related to
what users click and or linger on in their feeds.
Roberts asserts that intellectual property law has not traditionally protected
the way somebody styles their hair, makes up their face,
(11:34):
or decorates their home, whether or not those choices are
photographed and shared. And I do think it's sort of
I mean, had the influencer who
was making these allegations been able to have the
funds to see this case through, it could have been
very different. But had it, you know, had it
gone differently, this could have potentially like set an entire
(11:54):
new precedent when it comes to IP online, which I
just find fascinating.
Speaker 3 (11:58):
Totally, which I do think could be troubling for the
larger creator industry because sometimes you don't understand that, like,
you used XYZ sound and it came from
this person and they created a trend off of it.
Like, where's the origins? It's really hard to trace the
origin of a trend, you know, a concept or whatnot.
So it's a struggle.
Speaker 5 (12:26):
Let's take a quick break. And we're back.
Speaker 1 (12:43):
Okay, so I have to talk about this story with
Real Page really quickly. I will say right off the top,
it doesn't really fit with the content that we usually do.
But RealPage is my personal, like, I think of
them as my personal enemy. So any time that I
get an opportunity to talk shit about them, I will
take it.
Speaker 2 (13:00):
So indulge me.
Speaker 1 (13:02):
So on the episode last week, we were talking about
this provision buried in Donald Trump's big, beautiful budget bill
that would ban states from enforcing laws on AI for
ten years.
Speaker 2 (13:15):
Did you hear about this?
Speaker 3 (13:16):
Oh yeah, yeah, states' rights? Oh yeah, exactly. Like okay, exactly.
Like okay, exactly.
Speaker 5 (13:24):
So.
Speaker 1 (13:25):
One of the kind of real world examples of how
this might impact people that I talked about when we
talked about that bill was how scumbag landlords are able
to use technology like RealPage that allows landlords in
a whole area to work together, to coordinate to essentially
algorithmically determine how much they can raise your rents. Right,
So the government has called this price fixing, with which I
(13:47):
happen to agree. So these are scumbags who make scumbag
technology for scumbag landlords, and as a lifelong renter, these
people are my biggest ops. Like, I cannot stand this.
I think this technology is ruining cities. I hate it.
In twenty twenty two, a report from ProPublica linked RealPage to
rising rent prices across the United States, alleging that its
algorithm allows landlords to coordinate pricing, and the Department of
(14:09):
Justice and eight states sued RealPage last year, claiming
that it deprives renters like me of the benefits of
competition on apartment leasing terms. So cities like Minneapolis, Jersey City,
San Francisco, Philadelphia and others have passed laws that are
meant to ban the use of this kind of algorithmic
AI based rent setting software, and other states have legislation
(14:30):
like this in the works. But if this budget bill passes,
that means that states can no longer enforce those kinds
of laws, which would be a real win for scumbag
landlords and the company that makes this rent raising technology,
RealPage. So now senators are basically asking: did RealPage
pay to get that provision buried in the budget bill?
Speaker 2 (14:52):
This week?
Speaker 1 (14:52):
Lawmakers say that they believe RealPage might have spent
millions of dollars pushing for this provision. In a letter
that they sent to RealPage's CEO, five Democratic
senators, Warren, Sanders, Klobuchar, Booker, and Smith, are asking for
more information about RealPage's potential involvement. The letter reads,
in light of this, we seek information on RealPage's
lobbying efforts and how the Republicans' reconciliation provision would help
(15:15):
the bottom line of RealPage and other large corporations,
allowing them to take advantage of consumers. So I just
have to shout that out. First of all, just the
feeling of reading senators doing something that I feel like
does have a real impact on the lives of constituents,
Like it's just nice to be like, oh yeah, governing,
(15:35):
like somebody's out there doing it for sure.
Speaker 3 (15:39):
No, literally, especially in these chaotic times. It's sort of
like, is anyone doing anything? I think that's often how
people feel. Sometimes there's the feeling versus the reality. You know,
did somebody actually go and go to gov track and
see what bills were introduced? Right? There's always sort of
like that extra of things. But I think, for me,
when I think about this, to see that they're actually
(16:01):
looking at a tech piece of something, right, Like I
think that we understandably look at Congress at large and go,
oh my gosh, these old dudes that have no idea
how to even open Instagram unless like some assistant helps them.
You know, how could they ever understand AI and its
possible implications and everything else under the sun? So I
(16:23):
think to even see that they're reacting, I mean, I'm
not surprised that Elizabeth Warren was reacting given sort of
her expertise, and I hope that it continues
to expand because AI it honestly, it freaks me the
fuck out, Like I really, really like, am not a fan.
I know it's integrated into so many things that we do,
(16:45):
even the things that we don't even realize, but I
just don't like it. I don't think it's a thing
for good. Oftentimes, I really think, especially given the hands
that it's coming out of, that it's a thing for
bad, and that if we are to have it, it
needs guardrails. And the fact that this bill would
get rid of those guardrails, right, and make it, I mean,
(17:06):
if thinking about like how much AI has come into
our lives, like literally the last year and a half,
how much it's taken over the conversation. I mean, a
year and a half ago, I wouldn't even imagine that
we were talking about AI as much as we are.
So think about what that looks like at an accelerated
pace for ten years, right?
Speaker 1 (17:23):
And I mean I think it comes down to, like,
do people want a world where not just the
rent in your building, but the rent in your entire
city goes up so the CEO of RealPage's scumbag
AI price fixing tool can make more money?
Speaker 2 (17:41):
Is that a better world for you? For you and me?
Probably not.
Speaker 1 (17:45):
Definitely a better world for the CEO of this technology
company who is making money from it and profiting from it.
Speaker 2 (17:50):
But yeah, I would just argue, like, I can't.
Speaker 1 (17:54):
I mean, I could talk all day. The fact that
this provision would so clearly benefit the owners of
this tech company and harm just regular people trying to
make rent, you and me, renters, like, people who, you know,
are just regular folks, I think is pretty clear. And so yeah,
I'm with you. I think when you think.
Speaker 2 (18:13):
About the values of.
Speaker 1 (18:16):
The people who make technology like this, I tend to
think it's not going to be technology that makes our
lives better. I think it's really poised to exploit.
Speaker 3 (18:25):
And certainly this budget bill is not the way to
solve any of those potential problems. We look at this
budget bill too, and how it's going to increase the
deficit astronomically over the next ten years. So you pair
that with crashing essentially in a long way the economy
by knocking out the biggest you know, entry level segment
(18:47):
for white collar jobs. What like, what are we doing?
Speaker 2 (18:52):
Yeah, it's it's not good.
Speaker 1 (18:54):
And I have to say, like another story that I
was reading this week that kind of speaks to the
way that advancements of technology like AI really are hurting
us and particularly the most marginalized among us. This story
came out from 404 Media. Like, shout
out to them for being on this. We use their
reporting on the show all the time because they are
experts over there. So the vast network of surveillance that
(19:17):
we have in this country, like license plate reader surveillance tools,
well all of that is being used to track people
who are suspected of getting abortions. This is not shocking,
like It's what abortion advocates and like tech privacy advocates
have been warning is going to happen, and it's happening.
So 404 Media found that authorities in Texas
performed a nationwide search of over eighty three thousand automatic
(19:39):
license plate reader cameras while looking for a woman that
they said had self administered an abortion, including looking at
cameras in states where abortion is legal, such as Illinois
and Washington. So this technology, this license plate reader camera
technology is made by a company called Flock, and it
is usually marketed as being able to track and stop
carjackings or like find missing people, but it can also
(20:02):
be used to surveil people suspected of getting abortion care
across state lines. So the Texas sheriff in this case says, oh,
we were not trying to surveil her for having gotten
an abortion. On May ninth, an officer from the Johnson
County Sheriff's Office in Texas searched Flock cameras and gave
the reason for why he was searching these cameras as,
(20:22):
quote, had an abortion, search for female, according to multiple
sets of data. So whenever officers are searching Flock cameras,
they're required to provide a reason for doing so, but
they don't need a court order or a warrant.
But like, that was the reason that he stated that
female had an abortion. So he claims that the only
reason he did this is because, oh, the woman's family
(20:43):
had contacted us and that they were worried. He says,
her family was worried that she was going to bleed
to death, and we were trying to find her to
get her to a hospital. We weren't trying to block
her from leaving the state or whatever to get an abortion.
It was absolutely about her safety. So I got my
start in abortion advocacy, so it's something I know a
bit about. I have a lot of questions about this
because when you're talking about like a self managed abortion,
(21:04):
typically you're talking about a pill based abortion, which is
incredibly safe. While some bleeding is common, the risk of
complications and death is very very low, And so I'm
curious why they would think that this woman was at
risk of bleeding to death from having what I assume
is a pill based abortion. And even if they were
worried about her safety, that doesn't mean you can't be
(21:26):
criminalized for it. 404 Media spoke to Elizabeth Ling,
a senior counsel for If/When/How, a reproductive rights
group that runs a reproductive legal rights hotline, who pointed
out that many of the criminal cases that they have
seen of people being criminalized for trying to get an
abortion originate after somebody close to that person reports it
Speaker 2 (21:43):
To the police.
Speaker 1 (21:43):
They say that a research report published by the group
found that about a quarter of adult cases were reported
to law enforcement by acquaintances entrusted with information like friends, parents,
or intimate partners. So even if they were like, oh,
we were just worried about her, that is not a
situation that would protect her from being criminalized, especially in
a state like Texas, where abortion is essentially, in
(22:05):
most cases, criminalized already.
Speaker 3 (22:07):
A thousand percent. And I just, I also just question
the way it's presented, right, that the family's first reaction
would be to call the cops for help, right? I can't
think of a situation where if I thought somebody was
in need of medical care, that my first call would
be to the cops.
Speaker 1 (22:28):
Right, when I, when I know that I am in Texas,
a place where that call could potentially wind up with her
behind bars.
Speaker 3 (22:35):
Totally also like to give like a slight benefit of
the doubt. I do think that there are so many
people in this country, in every state, that have no
idea what's going on with abortion laws. This has not
crossed their desk, Like I know, it seems like one
of those things like how could it not have? But
I as we've seen with also other cases too, people
(22:57):
are really really unaware until oftentimes they are in a
situation either miscarrying, trying to get their own care, whatever
it may be, that they are realizing, oh shoot, like
this is not what I realized our reality was. So like, yes,
I could understand a family not understanding that they are
actually putting their family member in legal jeopardy, But I
(23:21):
still just find that being their first call a little strange.
Not like the EMTs, you know, or not going after
trying to find your relative yourself. It just seems strange.
It's not the behavior that I would think most people
would have. Perhaps I'm in the minority on that, But
(23:43):
I just find it odd and I find it worrying too,
that this is what we're using this tech for, which
is, again, where all the worries come back. It's like, if
we're going to have certain technology available, we need to
be very specific on how we're using that, and there
needs to be guidelines, laws, regulations on how that is
(24:05):
actually able to be used.
Speaker 1 (24:06):
The organization Privacy International actually just came out with a
new report looking at how much very sensitive data that
apps like period trackers are collecting and sharing and how
that data can be potentially dangerous in a landscape where
abortion is increasingly criminalized. So the news is kind of
mixed and that their report showed that apps have gotten
better with their privacy policies, but the landscape has actually
(24:29):
gotten worse, and so the bar needs to be higher
when it comes to privacy. To your point about this
is what we're using this technology for, I think that
technology and AI makes things that were not uncommon in
the abortion fight just much easier to scale. Like back
in the day, you know, I was doing abortion advocacy
(24:49):
work for a really long time. Back in the day,
it used to be that anti abortion advocates would like
stand in parking lots and manually write down the license
plates of people who drove into clinics.
Speaker 2 (24:59):
And they were incredibly well coordinated.
Speaker 1 (25:01):
And I can tell you from like personal experience that
it essentially felt like the police were on their side, right,
and so like that was just from my experience. So
like, they were these people who were
like incredibly dedicated to busying themselves with the business of others,
and it took a lot of work and coordination and
like in real life, you know, coming out to stand
(25:22):
outside of clinics and things like that. But now companies
like Flock are selling the ability to do that at scale,
maybe without even having to like leave your home with
this technology. And it's an incredibly terrifying harbinger of how
technology will be used to like further surveil and criminalize
both the people who need abortions and the people who
(25:42):
help them, like people who you know in Texas, I
believe it's even criminalized to like support or help somebody
who is looking for an abortion, right? And so it's
just a really scary vision of where we might be
heading when it comes to the use of this technology
to further criminalize people from getting abortions.
Speaker 3 (26:00):
Yeah, I don't think we're headed in a great spot,
and I will be curious to see if any Blue
states react legislatively on anything like this. I haven't seen
anything cross my desk so far, but I would be
curious to see if anything in terms of data sharing,
uh sort of pops up in the next few months
(26:23):
for anyone that's continuing to be in session, or also
Speaker 2 (26:25):
next year. Me too.
Speaker 1 (26:32):
More after a quick break, let's get right back into it.
So I want to switch gears a little bit. So
I am, I am not, I don't use dating apps.
I'm not on the market. However, I'm so fascinated by them.
(26:55):
When they make changes to like entice users,
I'm always, like, very curious what those changes are. Well,
Tinder is testing letting users set a height preference. So
Tinder is testing out this feature that lets paid subscribers
add height preferences to their profile.
Speaker 2 (27:11):
So it will not be a hard filter.
Speaker 1 (27:13):
Rather, the setting will indicate a preference, and that means
it won't actually block or exclude profiles with that hype preference,
but that hype preference, if they add it will inform
what kind of recommendations you see when swiping. So this
is one of the ways that we already know that
dating apps inform people's preferences. As TechCrunch puts it,
it is not uncommon to come across profiles where women
(27:34):
state they're only looking for matches who are at least
six feet tall, for instance, even if in real life
they would be more flexible about this requirement. So I
will say we've talked a bit about this on the
podcast before. It's not just about height. Like we did
an episode with sociologist and author of the book Not
My Type: Automating Sexual Racism in Online Dating, Dr. Apryl Williams,
(27:54):
and she has an entire body of research about this when.
Speaker 2 (27:56):
It comes to race.
Speaker 1 (27:58):
She found in her research that dating apps oftentimes are
not just like reflecting our biases, but they're actually reinforcing
our biases. And it makes me wonder if dating apps
are also doing this thing, the same thing with height.
Speaker 2 (28:11):
And I actually did wanna.
Speaker 1 (28:13):
I don't want to put anybody on the spot here,
but I think we have a short I mean you you,
I don't want to put words in your mouth. How
how do you, Mike, producer Mike, how do you identify,
how do you identify?
Speaker 6 (28:26):
No, No, I'm a shorter guy. I'm a shorter guy.
I am five six and change. I will own that.
It's it's who I am. And you know, a few
years ago, I was on the dating apps, and yeah,
there were a lot of women who had put in
their profiles, you know, only men six feet or above,
or whatever their arbitrary threshold was. And that's fine, Like
(28:50):
I do, I have zero interest in dating a woman
who would put that in her dating profile. I wish
her luck. I think there's a lot of women out
there who probably would prefer a taller guy, but maybe
they're open to a shorter guy. Great, that's you know,
a more appropriate woman for me. Uh so that's fine.
Speaker 2 (29:11):
I I don't know.
Speaker 6 (29:13):
I feel it's like there's already a bit of an
auto filter. I would put my height in my profile
just to like skip that whole song and dance. So
I don't know that this is going to be like
the magic bullet that saves Tinder, but uh, you know,
let them try.
Speaker 3 (29:33):
Yeah, I don't think so. Because also for Hinge, you
automatically have your height on there, right, so you can
pay to filter that, I believe. I mean, you're
not going to find me paying for a dating app,
like that is ridiculous in my opinion. Sorry, just like,
like, I can't. But it is already a thing, so
(29:54):
I don't know how they're so late to that feature. Like,
I just, I feel like it's like we're going back
to, I don't know, twenty fifteen. Like, where have you
been in terms of Tinder?
Speaker 1 (30:06):
So this is what I think too, And this is
what I found like almost a little offensive about the story.
In the TechCrunch piece on this, they say the
company may hope that the addition of a height setting
could encourage more women to use and pay for the app,
which tends to be heavily dominated by men in both
the US and internationally. So this, to me, I think,
really says it all. Like it seems to suggest that
(30:26):
women like, what women really want is the ability to
read people out by height, and that's what women are
willing to pay for, when in reality, I think the
reason why dating apps are in trouble and that like
women don't want to use them is because they're not
serving up women or really anybody good experiences.
Speaker 2 (30:43):
And they're not.
Speaker 1 (30:43):
Serving up good experiences for anybody, men either. And so,
the people designing these apps, like, if that's what they
think will meaningfully woo women into paying for Tinder Premium,
I think that perhaps that gives us insight into
why women are not paying for Tinder Premium or using
these apps at all. Because I just don't think that
they really are functionally understanding what it is that women want,
(31:05):
or, like, how to give women dating experiences that
Speaker 2 (31:07):
Don't feel so bad.
Speaker 1 (31:09):
Also, I have to say, Mike, when we were talking
about this before we got on the mic, you were like, oh, well,
we have data that suggests that men are just lying
about their heights on dating profiles, right, like, it wouldn't
even work.
Speaker 6 (31:22):
It's true, you're both women. So I'm curious if you've
ever experienced this phenomenon of a man lying.
Speaker 3 (31:29):
First of all, right, like, just cut that clip. But
second to that, I mean, you know, like when you
go on Hinge, for example, and the guy says five
ten, that means he's five eight. You have to subtract
two inches from any height.
Speaker 1 (31:44):
Sammy, it is so funny that you say this, because
according to OkCupid, who published this data, they say that
that is exactly the amount that men are adding. So
they're suspiciously adding two inches to their height, right?
So the man says five ten, you got to take
two off of that because that's the lie. And what
I feel like is, like, okay, one inch, I'll give
(32:06):
you that. Two?
Speaker 2 (32:08):
So that's a little much.
Speaker 3 (32:10):
They think that they can get away with it if
they're wearing a hat. They're like, oh, you see, it's
the hat, like, it stows the extra, extra.
Speaker 1 (32:20):
The shoe's a platform boot and a stovepipe top hat?
Speaker 2 (32:25):
What, is it the Abraham Lincoln hat? Yeah, like,
like a top hat.
Speaker 6 (32:29):
Yeah, and then he just has to wear that for
the rest of the relationship, like he wears it to bed,
he wears it to work.
Speaker 3 (32:36):
Honestly, at least he'd have a sense of humor. You'd
be like, wow, bringing something to the table.
Speaker 2 (32:44):
Yeah.
Speaker 1 (32:44):
As a tall woman, so I'm five eleven and a
half in any kind of shoe, I'm six feet.
Speaker 2 (32:51):
During the short period that I was on.
Speaker 1 (32:54):
Dating apps, I did put my height in my profile
because, I mean, I feel
like that sort of, you know, height wars stuff is
a little overblown, and I never found anybody who
had an issue with my height. But, Mike, similar to you,
I didn't want to be on a date with somebody
who would, like, not want to be with someone who
was six feet tall. But again, I do think
(33:16):
it just suggests to me that the people
who make dating apps are just not really thinking too
hard or thoughtfully about the kinds of experiences they're
trying to curate for people. And I think that that
is the root of the problem as to why they're
in trouble, not these little gimmicks about height or whatever.
Speaker 2 (33:33):
Not only, Sammy, as.
Speaker 1 (33:35):
You said, are they, like, out of twenty fifteen or something.
Speaker 2 (33:38):
They're very late.
Speaker 1 (33:39):
But I just don't think this is what is going to
save these platforms. I think they need to really be
thinking about, like, how to improve experiences in a more
meaningful way.
Speaker 3 (33:47):
And I think there's this idea out there, I don't
know if this is technologically accurate or not, but people
feel like they're being served matches that aren't accurate to
them or just absolutely trash. And then, if they
pay for the better, you know, the upgraded premium version,
whatever it's called, then they're going to be provided
(34:08):
better matches. And so that paywall to better matches is
also making people go, this isn't worth it, because
then if you do pay, it isn't necessarily better, and
it's a whole nother, like, level of gatekeeping. Like, no
one's getting anything more necessarily from it. We're just creating
more barriers, when I think people in the dating sphere
(34:29):
already feel like there's tons of barriers to meeting anyone.
So I think there's that element. And I also think,
even too, I mean, like, to your point about these
people that are making these apps, maybe they should
look at the prompts available. Some of them are like,
if you were doing orientation freshman year at college and
you're all looking around the room like, this cannot be.
(34:52):
This is so cringe. That's what these things are. So
I just think the people in charge
of the tech are not asking the right questions, and
they don't know their audience. They think they know their audience,
they think that a subset of data has told them, oh,
this is who our audience is, but they clearly haven't
spoken with those people. And that seems
(35:16):
to be a missing link across the whole space, whether
it's tech, whether it's the consumer space, whether it's politics.
I feel as though people just aren't having conversations with
their actual target market. Yeah, because if they did, the
feedback would be so much different, because I'm having those
conversations and I could give you, like, fifteen different points
that are so different from what they are sharing in
(35:37):
their big reports.
Speaker 2 (35:39):
Yeah, and I think you really nailed it.
Speaker 1 (35:41):
Like, we are so much more complex than like a
set of data that a tech company has on us,
and I think that they are trying to make decisions
about platform experiences based on narrowing us down into a
very specific set of data that they have on us.
Speaker 2 (35:57):
I think that's exactly right totally.
Speaker 3 (35:58):
Because, like, even think about someone's past dating experience,
that wouldn't be a part of that data necessarily. Say
they were dating their high school sweetheart
for ten years and they were a short king, and
then they had a terrible breakup, right, and then this
is their first entree onto the dating apps, and so
(36:19):
they don't want to do anything with any short kings.
They're over it. So therefore it's going to inform their
height preferences, which obviously the dating app wouldn't know, because
this is their first time on the app and swiping,
and they're only going to think of it as, oh,
this person's a snob about tall versus short, not understanding
their past interpersonal relationships. Right? So, like, there's so
many of those things that aren't captured that, again,
(36:42):
they're just missing it. They're just missing it.
Speaker 2 (36:45):
They're just missing it. That's a good way to put it.
Speaker 1 (36:48):
Okay, So I have to ask you about this Nancy
Mace report from Wired.
Speaker 2 (36:53):
It is a lot, I mean.
Speaker 1 (36:56):
South Carolina Republican Representative Nancy Mace is a lot herself.
Like, there's, I mean, she's just, there's a lot
going on with her. So, according to Wired, who spoke
to a bunch of her former staffers, Nancy Mace would
frequently monitor her image on social media, even going so
far as to create bots to comment across social media
(37:18):
in support of her, in this attempt to boost
her image. And she allegedly also asked her staff to
create fake profiles on social media in order to keep
an eye on the discourse about her and generally boost
her image online. According to one of her former staffers,
who told Wired, we had to make multiple accounts, burner accounts,
and go and reply to comments saying things that were
(37:39):
not true, even on Reddit forums. We were congressional staff
and there were actual things we could have been doing
to help our constituents. I would say that it was
at least a weekly comment, if not daily. People Magazine
called her staff, or her office, today for comment, and
they kind of, like, made a little joke. They were like, oh, well,
apparently we're too busy creating bots and making comments
(38:00):
to answer your query.
Speaker 2 (38:02):
But this is just my opinion.
Speaker 1 (38:04):
I absolutely can see her doing this, Like, I absolutely
see it. I have no trouble believing this about her, none.
Speaker 3 (38:09):
Oh, none. And also, her past staff clearly hates her,
oh my god, yeah, within an inch of her life,
like, absolutely hates her guts, because obviously it's a hell
of a story. But it just doesn't help anyone,
so you really have to be quite bitter to get
yourself in a mess like this, talking about your old boss.
(38:31):
That's the first thing that pops up. Second to that,
I really doubt that she is the only member of
Congress doing this. Like, I don't know anyone personally doing this,
let me just go on record saying that. And
also, second to that, I do wonder if it comes
into violation of campaign finance laws.
Speaker 2 (38:51):
That's a really good question, very good question.
Speaker 1 (38:54):
Yeah, I bet she's not the only person doing this.
But something tells me, like, if you're gonna do
this and get your staff mixed up in it, you
better be treating that staff pretty good. And the fact
that your staff is this leaky, and that they're
running to Wired to be like, she makes us do this.
Speaker 2 (39:10):
I think you're right. It really says something about the
dynamic on her staff.
Speaker 1 (39:13):
And the report actually quoted a deposition from Wesley Donehue,
a South Carolina based campaign consultant who previously worked closely
with Mace's campaigns. Donehue told a court, quote, she
programs her own bots, sets up Twitter burner accounts. This
is the kind of thing she does. She sits all
night on the couch and programs bots because she's very,
very computer savvy. She controls her own voter database, she
(39:34):
programs a lot of her own website. She programs Facebook
bots and Instagram bots and Twitter bots. It's what she
does for fun. And let me tell you, what a,
like, grim projection of what an evening at Nancy Mace's
house must look like. Curled up on the couch,
having a great time controlling her bot army. Pretty grim.
Speaker 3 (39:54):
No, for sure, I mean, holy narcissism.
Speaker 1 (39:57):
And it's so wild to think that when Nancy Mace
first came on the scene many, many, many years ago, it
was because she was the first woman trying to get
into The Citadel. Like, she has had a very long
kind of life in public, and, yeah, it just seems
like something's going on. And subsequently, I have no trouble
believing this report from Wired. I guess I'll just put it
Speaker 3 (40:18):
That way. Totally, and again, I would a thousand percent
guarantee she's not the only one doing it, maybe not at
the obsessive capacity in which she is clearly doing it,
aka the night at Nancy Mace's, right?
Speaker 5 (40:30):
But.
Speaker 3 (40:31):
I mean, if you spend time, as we both do, you know,
being chronically online people, you start to understand what
is and what isn't a bot, what their sort of characteristics are,
and how they pop up occasionally. I bully bots, I want
to be honest. Finding a nice little bot and bullying it just to
see it prove itself as a bot can be a
(40:52):
little fun, asking them insane questions and sort of seeing
what they spin out. It is satisfying. But I think
just seeing, sort of, you know, knowing that that is
the way of the world, that there are so many bots,
I mean, there are also so many people paying for
these bots and creating them on both sides of the aisle,
and beyond that, beyond our borders. So yeah, we're
(41:15):
in a war of clips and we're in a war
of bots. But I think the more that people know
the media landscape, like, that is the reality that we
are in, to be aware when you go to a
comment section that what you are seeing might not be reality.
Right? That comment on TikTok that has fifty thousand likes
(41:35):
from user one, two, three, four, five, six, seven, like, that,
that particular comment is trying to curate a reaction, and
it's also being utilized to curate more, you know,
not just reactions in terms of the likes on it,
but the comments, and then the videos off of, can
you believe this comment section? So I think, at least
if we're in a position where people understand what they're
(41:58):
actually taking in and what risks there are to seeing
a comment section, I think we're at least in a little
bit better of a spot. But hey, that's another area
where I say, hey, where's the regulation on this bot situation?
Because it's crazy, and people really believe what they read
(42:19):
no matter what. And it doesn't matter how
smart the person is. Like, I think we're all guilty
of it too, being like, wait, oh my god, the
vibe check on this comment section, people are really
feeling X Y Z way, and right, it's just how you feel
about something. And we as people, I think, oftentimes want
to see the best in what we're taking in and
(42:39):
the best of intentions, but oftentimes that's not the case.
Speaker 1 (42:42):
So Sammy, you are somebody who has a ton of
experience with this. This is very much in your wheelhouse.
You've had ten years in media, pr, comms, politics, and
you know, we were talking earlier about how people are
always just sort of wanting to pick your brain and
get advice about how these things shape so much of
our discourse. So you were telling me about this new
thing that you're doing called office hours.
Speaker 3 (43:01):
Yeah, so office hours was really born out of getting loads
and loads of questions as to how on earth to
deal with this new media sphere, right, from podcasting, which
we're doing right now, to social, to newsletters, to everything
else in between, how to actually activate in this space.
And basically, office hours, you can book it for an hour,
(43:23):
you can book it for a half hour. You get
to pick my brain on strategy development across the board.
It can really be anything across that larger comms space, again,
whether it's marketing, PR, strategy, social, understanding what you're seeing online,
how to actually develop a strategy, how to react, what works,
what doesn't, everything in that bucket and more. You're able
(43:45):
to book an office hour and chat with me about
it, and we can figure out the best path forward for
what you're working on. And I do this on the
political end of things, but I also do it in
the consumer space too. So everybody's welcome at office hours,
and you can book a time with the link
in bio situation.
Speaker 1 (44:04):
It is always a classic, so we'll put the link
for folks to join in the bio. But it is
really important. Like, I used to do, like, disinfo trainings
and media trainings, and I think especially right now, when,
as you said, the media landscape is shifting so quickly,
it can be really hard for folks to get
a handle on it. Like, it's hard for me sometimes.
And so I think that's a valuable service that you're offering.
(44:25):
And I actually know somebody who I think could probably.
Speaker 2 (44:29):
Like, make good use of this.
Speaker 1 (44:31):
And that is the last story that I want
to talk about, which is that readers of fantasy
author Lena McDonald's book, called Dark Hollow Academy: Year
Two, noticed an interesting editing note embedded in chapter three
that suggested that she might have used AI to write
this book. Sandwiched in between the
(44:51):
regular dialogue of the book, there was a note that said,
I've rewritten the passage to align more with J. Bree's style,
which features more tension, gritty undertones, and raw emotional subtext
beneath the supernatural elements. So not only was this author
allegedly using AI, she did not delete the AI note
(45:12):
that made it into her published book, and
she was using AI to make her writing sound more like
another person's writing.
Speaker 2 (45:19):
When I tell you, I would simply.
Speaker 1 (45:20):
Have to die, like like I don't know if there
would be any coming back from this, Like I would
be so embarrassed.
Speaker 2 (45:26):
I would pack it up.
Speaker 3 (45:27):
No, literally, like, you'd never see me again. I'd be
off on some island with no Wi-Fi and just
maybe a nice little canoe. Like, that would be truly all.
And it reminds me of, I'm blanking on her name,
but she's a cookbook author, and this wasn't her fault
at all, the editor unfortunately missed it. But there, I
guess on an image of her, was a note left
(45:47):
in the book to, like, tone her arms more. Ah,
and all the copies went out. Like, just horrendous. I
don't think it was her first book, either. And
again, it wasn't her, it was,
you know, sort of the editor, which I still feel
badly for, because obviously that was not their intent. But yikes, yikes.
(46:11):
And it makes me just think about how one of
the areas of expertise that we really have let go of
is our editors, right, people that catch these things, that
do the due diligence, that go back through things. We
look at technology and we go, oh, it's gonna, it's
(46:31):
got us, it's fine. I can't tell you how many times
I've control F'd on a spreadsheet and it's missed something. Okay?
So I just think that the personal touch, the
human to human, does catch some of the things that
are missed. And that is just so wild and uncomfortable,
and I just, yeah, I don't know what one does.
(46:53):
I mean, I could probably think of a PR way
out of that one, but yikes, that's,
that's a long road. That's definitely not a quick fix,
I'll tell you that much.
Speaker 2 (47:03):
Yeah, Lena, reach out to Sammy for some crisis comms.
Speaker 1 (47:10):
But it's also, it's just a good reminder that, like,
there are so many reasons why people should be careful
about the way they use AI, particularly in their creative work.
You know, so many reasons. If
you're gonna use AI, you need to really be careful.
Speaker 2 (47:23):
Uh.
Speaker 1 (47:24):
But the biggest one now is like you could be
horribly embarrassed all across the internet when your AI note
is published in your fucking book.
Speaker 2 (47:34):
My god, so bad, so bad.
Speaker 3 (47:37):
I just hope that if she does decide to write
another book, that AI, just in its existence, right,
is integrated into the storyline. Oh yeah, that would be
like a way to partially solve it. Be like, actually,
it was an Easter egg, you guys caught it. Again,
(47:59):
I'd have to read the book. I really can't say if
that was a good strategy or not. But you know,
there's something.
Speaker 1 (48:04):
This is why you're the pr master, Sammy. People, that
was actually quite masterful.
Speaker 3 (48:10):
Thank you, Thank you, Sammy.
Speaker 1 (48:12):
Thank you so much for being here and helping us
break down all of these stories across the internet.
Where can folks keep up with all the things that
you are doing across the internet?
Speaker 3 (48:20):
Well, thank you so much for having me. This has
been so much fun. You can all tap in at
girlandthegov dot com. You can find the newsletter there, and
then for social, Girl and The Gov and Girl and
The Gov, The Podcast.
Speaker 1 (48:33):
Check it out. It is a very useful newsletter. The
podcast is great. You all talk to like legitimately important people,
like elected officials and like have genuinely important conversations that
are impactful for everyday lives.
Speaker 2 (48:44):
I really really respect what y'all are doing.
Speaker 3 (48:46):
Thank you. Yeah, it's been crazy. Sometimes I forget that
we've interviewed some of these movers and shakers myself, in
this, you know, just that way that your career sometimes
moves, where you're like, oh yeah, I can't believe I did
X Y Z. But we've done it, and there
are definitely some interesting conversations on the podcast feed.
And then also on the social, we've been mini mic-ing
(49:08):
it up. Okay, those mini mics are living ten lives.
Speaker 5 (49:12):
I'll tell you that much.
Speaker 2 (49:13):
Giving them a workout.
Speaker 3 (49:14):
Well.
Speaker 1 (49:15):
Thank you so much for being here, and if folks
want to follow me, you can follow me on Instagram
at bridget Marie DC, on TikTok at bridget Marie DC,
and on YouTube at There Are No Girls on the Internet.
Thank you so much for listening. We will see you
on the Internet. If you're looking for ways to support
(49:35):
the show, check out our merch store at tangoti dot com
slash store. Got a story about an interesting thing in tech,
or just want to say hi? You can reach us
at hello at tangoti dot com. You can also find
transcripts for today's episode at tangoti dot com. There Are
No Girls on the Internet was created by me, Bridget Todd.
It's a production of iHeartRadio and Unbossed Creative, edited by
Joey pat. Jonathan Strickland is our executive producer. Tari Harrison
(49:59):
is our producer and sound engineer. Michael Amado is our
contributing producer. I'm your host, Bridget Todd. If you want
to help us grow, rate and review us on Apple Podcasts.
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.