Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media.
Speaker 2 (00:16):
Say It sixteen.
Speaker 3 (00:28):
Six six.
Speaker 2 (00:54):
Welcome to Sixteenth Minute, the show where we take a
notorious moment or main character of the Internet and remind
you that they're a person, and so are you, and
that sucks for everyone. And this week we're taking a
bit of an interlude, a breather, if you will. I
don't know if I've ever really, like, properly taken a breather.
Say it sounds like that scene from Midsommar when I'm
(01:17):
taking a breather?
Speaker 4 (01:19):
Is that relaxing? See, I'm relaxing.
Speaker 2 (01:22):
But occasionally, for Sixteenth Minute episodes to be the right length,
we will end up having to cut interviews that I
really love. And today's interview with Jason Koebler at 404
Media is one of those. It touches on
a lot of stuff that's been on my mind, including
the reason that this episode is a little shorter than normal.
So we're gonna share these interviews that got cut, mostly
(01:43):
because I think they're instructive to what this show is about,
but also to ensure that the quality of this show
is staying consistent. And this interview actually motivated me to
want to talk about that a little more or at
least once, because the reality of making any reported digital
media right now is, in my opinion, even when you're
lucky to work with an incredible group of producers and
(02:05):
collaborators and are paid well to do it, as I
am very lucky to be, there is still a lot
of pressure to produce, right, just like make as much
as possible. But if you want to make stuff at
the top of your ability, at some point this is
going to be at odds, right, or in order to
make it not at odds, you'll have to give other
(02:26):
parts of your life up. Jason and I start by
talking about a rise in AI art, but I was
really pleasantly surprised that we sort of took this turn
into talking about artists and creators and journalists being forced
to compete and compare themselves to machines, because that ethos
to make as much as possible ties into all these
(02:48):
dystopian problems that are happening around us. Journalists are losing
jobs in massive numbers. The remaining journalists are not paid
enough to make up for the fired people's absence, but
are expected to cover the same amount at the
same skill level, and the archive of all of those
journalists' work is being disappeared from the Internet as if
(03:08):
the labor never even took place. And this is a
tiny thing, but I cannot stand that analog media collectors
were right all the time. All of my Blu-ray
boyfriends were right. I was wrong to bully them for it.
I talked about this in the first episode of Sixteenth Minute.
You are listening to a future piece of lost media.
(03:29):
I believe that anyone who's making anything and distributing it online,
and it's a lot of us, are being encouraged to
churn out content at unprecedented rates and are being told
to accept that that labor can be deleted from existence
if the company that happens to be your mommy needs
a tax cut. And so with this pressure to create
(03:52):
while being told that this creation is disposable, it feels
like some have taken a very depressing message away. Why
even bother to make something that you would stand by
if it's just gonna disappear anyways. It's a wrongheaded and
embarrassing takeaway. But I genuinely think that this is part
of why there is such a plagiarism issue across media
(04:15):
right now. It's really frustrating; it's no excuse for that behavior,
but maybe you've noticed it. The mindset of oh cool,
I'll just make some shit up or steal it from
someone else so i can post more, because that's what
I'm supposed to be doing. I, along with millions of others,
watched a YouTube doc from Hbomberguy late last
(04:36):
year called Plagiarism and You(Tube) about the proliferation of plagiarism
on YouTube that spotlights a few damning specific examples. And
all of these examples were creators, which, God, what a
nebulous term. I wish we had a new word, créateurs.
All of these examples were creators who had ripped off
(04:56):
others without credit, in the interest of producing more, faster,
that still felt like it could be made by them.
And sure, this problem has a lot of heads to
cut off. Another issue is that crediting practices online seem
to be worse than ever. I mean, I know early
in my podcasting I wasn't encouraged or as beholden to
(05:16):
provide detailed citations, and it produces problems like what we
talked about in the Black TikTok Strike episode. The TikTok
algorithm encourages people to replicate dances, but doesn't encourage them
to credit the choreographers. And that's how you get Charli
D'Amelio with a million dollar brand deal performing other people's
work that she couldn't tell you the source of. But
(05:37):
the trickier thing here is the volume of content. Hbomberguy
is an incredibly thoughtful and talented person. There's
no one like him, and he and his producer usually
make about one long video a year, a feature length,
carefully researched and cited video a year, with an eye
to visuals, to pacing, to all of these things. That's
(05:58):
why they're so good. They're taking the time. I'm not
introducing a new idea here, and in an equitable landscape,
everyone would have that time. But what I'm seeing for
newer creators is that the baseline encouragement is to make
as much as possible, in a way that's not just
unsustainable unless you stop hanging out with your friends, but in
a way that is literally impossible. And I'm guilty of this.
(06:22):
I'll sometimes see the output of one of my peers
and be like.
Speaker 4 (06:26):
I need to do more. I'm not doing enough.
Speaker 2 (06:29):
The problem is I don't know how that would be
possible without completely cratering my life or starting to take
shortcuts, making the thing a little bit worse. And I
feel tight-chested and like I'm bitching and I'm fine. All
I'm trying to get at here is that I don't
know how to reconcile making the kind of quality things
that I want to make with the amount that people
(06:52):
are encouraged to make. I don't know how I would
do that unless I were a machine. And that's kind
of what I think we're being encouraged to do. And
how we're being encouraged to think of ourselves: produce things
at the rate of a machine while remaining a discernible, vulnerable,
advertisement-reading savant. When I think about how I would
(07:12):
like to be able to work, it's embarrassing, right, Like
my ideal form is someone who can produce like a
machine while maintaining the just relatable enough veneer of nice
legs and gum disease. But more often than not those
things are in direct conflict. It's not possible.
Speaker 4 (07:29):
It sucks.
Speaker 2 (07:29):
It sucks that we're being encouraged to produce like machines,
But I'd be lying if I didn't say that.
Speaker 4 (07:36):
At many points in my career.
Speaker 2 (07:38):
That wasn't a fantasy of what I wish I were
capable of, which is, you know, not punk rock at all.
What I meant to say was every so often, Sixteenth
Minute is going to release a shorter episode of an
interview that I wasn't able to get into the final
cut of an episode. I am very guilty of assigning
myself too much homework, which my collaborators and producers Sophie
(08:01):
and Ian can certainly attest to.
Speaker 4 (08:03):
I'm sorry, guys.
Speaker 2 (08:05):
Today we are talking to Jason Koebler, whose work has
been invaluable to me in making this show, and I've
been a follower of his work for years before. We
originally recorded this for last week's episode about the sixty-five-foot
hot dog in Times Square, because as I
was observing the online reception to this incredible public work
of art that was the sixty-five-foot hot dog,
(08:27):
there was an even more prominent strain of art discourse
making the rounds in my little algorithm. So with that
in mind, here's a little setup of what we're dropping
into: AI images. I know you've heard of them; sometimes
they're even called art. And in the weeks that the
Times Square hot dog was this big fixation, Facebook and
LinkedIn in particular were pushing AI artworks into algorithms at very,
(08:52):
very high rates, and tech journalists were starting to call bullshit.
This story was on my radar for a number of reasons.
As a listener of fellow Cool Zone Media podcast (plug)
Better Offline with Ed Zitron (plug), I already knew that
Google implementing AI into its search engine had all but
blown the entire website up, and as a follower of
Jason Koebler at 404 Media, who was also
(09:13):
a former editor of Motherboard and a critical source in
our Boston Slide Cop episode, I started seeing reports of
Facebook being flooded with fake AI images. Now I've been
seeing AI art go viral for years. People had a lot
of fun with those early generators, right, and seeing the
Polar Express-movie-looking results. But as time went on,
(09:34):
many were finding AI images harder and harder to identify
as AI images. Recently, there was a lot of talk
about the AI-generated All Eyes on Rafah graphic that
went around on Instagram in lieu of actual on-the-ground
images of Gaza, of which there were plenty that
would have done far more to demonstrate the severity and
(09:57):
the violence that is taking place as Palestinians are
continually targeted and genocided by Israel. And then over on
platforms that younger people kind of don't use anymore, Facebook
and LinkedIn, AI was flooding their algorithms and the images
taking off were unbelievably fucked up looking. Let's throw in
(10:18):
some music, why not? A series of AI photos of
a child in what appears to be a rural African
village with these weird art projects: a see-through motorcycle,
a portie made of popsicle sticks, and, bone-chillingly, a
wooden bust of Mark Zuckerberg. A hummingbird with gigantic hairy testicles,
(10:41):
with the caption ninety nine years of luck, you will
never lack money for your trips and travels. Peter Griffin
sharing food with orphans in Africa.
Speaker 4 (10:51):
A dead eyed Jesus Christ with.
Speaker 2 (10:52):
The crown of thorns, surrounded by two beautiful light attendants,
one of whom is giving him a gigantic cheeseburger. Captioned
Beautiful Cabin Crew Scarlett Johansson hashtag boom challenge. So many
of them have This caption could not tell you why
with a gun to my head. These are kind of funny, right,
but on a long enough scrolling timeline, it gets unbelievably depressing.
(11:16):
And if they're in your feed, they're preventing you from
seeing something useful or at least made by a person.
And all of this made the conversation around a sixty
five foot hot dog with intent made by people, and
the importance of public art and public art funding feel
all the more relevant. I asked about the AI art
(11:50):
that is attempting to take over our social media feeds
with Jason Koebler. Here's our talk.
Speaker 3 (11:56):
I'm Jason Koebler. I'm the co-host of the 404
Media podcast.
Speaker 4 (12:01):
First of all, thank you so much for doing such
incredible work.
Speaker 2 (12:06):
Your research into the slide cop was integral to getting
us closer to cracking that story.
Speaker 4 (12:12):
No one has done it still, but.
Speaker 3 (12:14):
I love slide Cop.
Speaker 1 (12:15):
Yeah.
Speaker 3 (12:15):
We got that FOIA response back like the day before
we launched our new website, and I was like, oh,
we have to lead with that.
Speaker 2 (12:23):
During the same period of time of this public artwork
and publicly funded artwork that I'm talking about, there is
another conversation going on about the rise of AI art
that is also being like flooded into algorithms in certain
areas of the Internet.
Speaker 4 (12:41):
How did you first How did this first get on
your radar?
Speaker 3 (12:45):
Yeah, so, I mean I've been aware of the fact
that AI art exists for a very long time. It's like, yeah,
you know, we sort of covered the rise of like
DALL-E and Midjourney, and people just sort of like
generating images of like very often anime girls and then
like futuristic cityscapes. It's like people either gravitate to
(13:08):
like hot women or like sci fi future.
Speaker 4 (13:13):
Yeah, like weird screensaver images.
Speaker 3 (13:17):
Yeah, like when these things launch. But I think that
I first started seeing this as like a thing that
was just going to flood the Internet when I started
seeing these images like on Facebook all the time. Yeah,
And it started with, if you can believe it, like
a guy in the UK who sculpts dogs out of
(13:41):
logs using a chainsaw. He does like chainsaw art. So
it's like if you want a statue of your dog,
he will make one for you out of like a
giant piece of wood. Incredible. And this is like very expensive.
It costs like thousands of dollars and it takes him
like weeks to do and he like documents the entire
process as he does this and is like a social
(14:04):
media influencer alongside of selling these like wooden dogs.
Speaker 2 (14:08):
That used to be one of my favorite areas of
YouTube was like weird art process videos, but at least
it was proof that it happened right.
Speaker 3 (14:16):
And it's like, I mean, these are things like you
see him using the chainsaw. You see him taking like
a gigantic block or cylinder of wood and turning it
into like a German shepherd, and you know this
is like a popular type of content online. It's like
it was going viral on its own. But then you know,
one of our readers actually sent it to me because
(14:40):
he had seen like fifty different versions of the same image,
except in every image the dog was slightly different or
the man was slightly different. The guy who does this
is like a white, like, twenty-five-year-old from the
United Kingdom, and in some of the images this man
had become a woman. Sometimes it was like he had
(15:02):
a goatee. Other times the guy was Latino. Sometimes
the dog was a German shepherd. Sometimes it was like
a Saint Bernard, other times it was like a poodle.
It was like but it wasn't very obviously AI. It's
like it was still like kind of realistic. So if
you were just like scrolling through, it's like, oh wow,
like lots of people are carving dogs these days, and
(15:24):
they were like going viral on Facebook. So I was like, oh,
like what's going on here? And then I found this
community of people on Facebook who were like documenting the
spread of this type of content where basically like a
viral image was being run through an AI and then
was going viral itself even though that new image was
(15:45):
not real.
Speaker 2 (15:46):
Who benefits from these images being out there?
Speaker 3 (15:51):
I mean, I think it's just like kind of run
of the mill spam. It's like Facebook has been filled
with spam for years at this point, and there are
people who are just like building pages that have tons
and tons of followers, and they're either selling the pages
or they like find some way of turning it into
either a scam or just like a spam opportunity. It's
(16:13):
like a lot of them link off of Facebook and
then you click on the link and you get like
nine hundred pop ups and they're all just like Google ads.
So you have people like collecting, you know, their pennies
from each of these little clicks. A lot of the
people who are doing these are in like Vietnam, Cambodia, Brazil, Nigeria,
(16:34):
and we know that because Facebook started making people disclose
like where they are located. Interesting if you run like
one of these pages, yeah, which was something that they
did like after like the fake news scare of twenty sixteen.
Speaker 4 (16:50):
Okay, well seems to have really helped.
Speaker 3 (16:53):
They surely, Yeah, yeah, But anyways, it's like a lot
of this stuff is happening in places where the cost
of living is a little bit lower, and so I
don't think that they're making tons of money, but they
are clearly making enough money that it's like beneficial for them.
Speaker 2 (17:07):
To do it. They also seem to be sort of
like sticky inside of certain algorithms. I know you've written
about Facebook in particular and now also LinkedIn more recently.
From what you can gather, are these images more easily
sucked into algorithms than stuff that's real?
Speaker 3 (17:23):
I think that some of it is so weird and
bizarre now that you kind of like can't help but
either click on it or like engage with it in
some way, even if you know that it's AI. And
so, you know, I'm a reporter. I've been seeking
this stuff out for a while, but once I started
clicking on it, that's all I got. I get like
tons and tons and tons of AI images. And I
(17:45):
think that Facebook hasn't been a super relevant platform for
people who are pretty online for a while. And so, yeah,
people don't like use Facebook that much. It still has
billions of users, actually, but among journalists and comedians and
people who just like use the internet a lot, I
don't think Facebook is that popular anymore. And so I
(18:06):
know that some people like went back to Facebook when
they saw that this weird stuff was showing up there.
And then you click on like one image and that
ends up being like all that you see. You mentioned
the algorithm, And a really interesting thing that's happened is
TikTok was like the biggest threat to Facebook for a
long time. And by Facebook, I mean like meta the company,
because it was like destroying Instagram for a moment. That's
(18:29):
why they introduced reels. And like the big thing that
TikTok has is the for you page, which is the
algorithm where you just like open it and you can
scroll forever and you'll just see like stuff
that the app thinks that you'll like. And that's like
not something that Instagram and Facebook had for a long time.
It's something where like you opened it and you would
(18:50):
see there was an algorithm, but it was mostly from
pages that you followed or pages that people you liked
or were friends with engaged with. And Facebook launched this
thing about a year ago called Recommended for You, and
it's basically the TikTok algorithm, but for Facebook. I know
that was a long wind up, but basically what has
(19:10):
happened is Facebook has started showing people things that are
popular on Facebook, regardless of whether anyone you know has
like ever engaged with it, or has anything to do
with it, or like cares about it at all. And
that's how a lot of this stuff is getting recommended
because not only are these people like posting the ai images,
(19:30):
they then have like an apparatus to like comment, engage
with it and get it going in the algorithm for
lack of a better term, and then you end up
seeing it.
Speaker 2 (19:39):
I know it's not the exact same thing, but like
when there was that significant Twitter algorithm shift and all
of a sudden, there are main characters introduced that probably
wouldn't have been possible ten years before because you never
would have seen them.
Speaker 3 (19:51):
I've seen two things. One, there's like a bunch of
people on Facebook who can't tell that this stuff is AI.
There's like tons and tons of bots. And then there's
people who don't know that it's AI. Then there's people who
do know that it's AI but are just like engaging
with it because it's weird. And then there's like a
whole other phenomenon that is happening where there are people
(20:13):
who have heard about AI art and know that AI art exists,
but don't want to be fooled by it. These are
people like a lot of my parents' friends; they've like
heard about AI and they're like, I'm not going to
fall for it. Because I like started asking people like
have you seen AI art on Facebook? Like are you
getting it on your Facebook? I like posted that on
(20:34):
Facebook and then ask people to send me examples of
AI art that showed up in their feed, and so
many people sent me real stuff that they're like, this
art's too good, like it can't possibly be real. And
(21:00):
so I think this other thing is happening too, where
people like don't want to be seen as being an
idiot who fell for AI art, and so they have
their guard up to the point where like, that's fake,
that's fake. That's fake.
Speaker 2 (21:10):
I saw a thread from an artist who was beside himself.
You could plug your own art into an AI driven
analysis bot that would tell you what the percentage of
likelihood that your own art was AI, and sometimes you
would get like a real piece of work that was
like this is seventy five percent likely to be AI.
And so there's actual artists who are beside themselves having
(21:33):
to defend that their art is real.
Speaker 3 (21:36):
AI art is like a black box algorithm, like you
type a prompt in and something comes out the other end,
and you don't know exactly how that was made. And
then there are all these AI art detectors that themselves
are AI and themselves are like this black box that
we don't know how they work. And you know, some
of them are designed by people who are trying their
(21:58):
best and like trying to like help determine like is
this real, is it not? But then there's also a
bunch of like snake oil people out there making the
same thing. It's just calling real stuff bullshit because we
don't know how it works either. It's just like a
fake program that is also AI that is not accurate
(22:18):
or trustworthy either.
Speaker 4 (22:19):
That's interesting that.
Speaker 2 (22:20):
Yeah, you've found that people are becoming so paranoid that
they're mistakenly preventing themselves from engaging with something that's real.
Probably an obvious question: who stands to be harmed
if this continues trending in the same direction?
Speaker 3 (22:37):
Yeah, I think that there is a backlash to this stuff,
Like I don't think that people like it for the
most part, like the masses. I think that the masses
want to like engage with real art made by real
humans that time and effort and care was put into.
So I do think that what's happening is very bad
(22:59):
and scary. Like I have some optimism from the backlash
that we've seen where people are going to like specifically
seek out and support human journalists, human artists, human musicians. However,
I think that we're in a very critical moment now
where every single platform is like injecting some version of
(23:21):
AI into their platform and is being taken over by
AI spam to some extent. It's like Google has its
you know, AI discovery stuff, like Facebook is a mess,
Instagram has all these AI influencers on it now, TikTok
is the same. I think discoverability on the internet is
what is most at risk. It's like, are we going
(23:43):
to be able to find human art when an algorithm
can make one thousand paintings in one second and a
human takes a week to paint a picture. It's just
like humans can't compete with the output that's happening, And
so I worry that human creativity and human stuff is
(24:04):
just going to get drowned out.
Speaker 2 (24:06):
Rightfully so. In the last year, there's been a lot of talk about,
you know, I hate that they have to be, like, human creators,
but human creators who are making stuff, but they're trying
to please the algorithm. So they're making a lot of
stuff really fast. There's like a rise in plagiarism. There's
a rise in just like I need to have something out.
I don't care if it's good, I don't care if
it's anything. It's the only way to sustain and connecting
(24:27):
those two ideas of like, yes, there's been a huge
rise in finding out that someone who presents as very
genuine doesn't actually do their homework, outsources it somewhere, steals shit,
and that's awful, but I feel like it is in
response to something where you know, it's a very cynical
way of being, Like how else do I compete when
you know there's an AI technology that can, at least
(24:48):
on its face, think it does what I do and
reach a larger.
Speaker 4 (24:52):
Audience? Like, what the fuck do you do?
Speaker 1 (24:54):
Uh?
Speaker 4 (24:54):
Not plagiarize other people?
Speaker 2 (24:56):
But like, it's just, I feel like it just
introduces all of these like ethical problems among human artists.
Speaker 3 (25:05):
I approach this from, you know, the perspective of like
a journalist and a writer and someone who publishes blogs
on the Internet, and I think that what happened. I
think it's very similar for comedians and people who make
like TikTok videos and YouTubers and artists, where the Internet
is kind of a slot machine like a lottery. It's
(25:26):
like you spend a lot of time creating something and
then you publish it and either a bunch of people
look at it or like no one looks at it.
You either sort of like win the algorithmic lottery and
you get into the system that is like here's all
your retweets, here's your likes, here's your comments, and like
you're going to get millions of views on this thing,
or you're gonna get hundreds of thousands of views on
this thing, or you're gonna get like nothing. And there
(25:50):
have been so many times where I have worked very
hard on an article and talked to like a lot
of people, like interviewed a bunch of people, spent a
lot of time on it, painstakingly edited it, and
then published it and no one reads it, and I'm like, oh,
that sucked. But then like five minutes later, I will see,
(26:10):
like I'll do something that I don't care about that
much and it will be like a short, jokey blog
about whatever, and I publish it and it goes massively viral,
and I'm like, cool, wish I could have made that
happen to the thing that I cared about versus the
other thing. And so I think that's how you end
up with this phenomenon that you're describing, where you want
(26:31):
to take as many bites at the apple as possible,
and like humans start behaving, humans start like doing stuff
trying to reverse engineer what will work with an algorithm.
And I think that's what makes me so scared about
AI generated content, is that a human might get one
(26:51):
bite at the apple or five bites at the apple
if they work very quickly, and the AI can take
like a million swings at it because it can just
endlessly generate different iterations of it.
Speaker 2 (27:03):
Do you have advice for people who are combating this
machine in art specifically, but also just you know, if
your work is in constant contest with AI.
Speaker 3 (27:15):
So I worked for Vice for ten years, and
I'm proud of what we did there, and you know,
I don't think that what we did was bad, but
it was also part of like a big company where
we were like publishing lots of articles. We were you know,
trying to get people to read our stuff, so on
and so forth. And now you know, I started a
(27:38):
company with three of my former colleagues called 404
Media, and we're not going to have the
scale that Vice had. Like we're not going to have
as many people read our stuff. But what we have
done and what sort of like gives me hope, is
we have started explaining to people that in our work,
(28:00):
like hello, we are human beings who are sitting at
a computer typing up our little posts and putting them
on our little website. And this takes a lot of time,
it takes a lot of effort, and here's like how
we do it, and here's why you should support that.
We've tried to like give a peek behind the curtain
that sort of like documents the process of like what
(28:22):
reporting is. It's not that Vice never did that, but
it's like many of these like really big outlets have
like a view from nowhere where the journalist isn't like
inserting themselves into the work, and the blog just like
appears on the Internet. And I think that you
can take that and extend it to everything. It's like
(28:44):
if you're a photographer and you're just
posting photo, photo, photo, photo, photo, those photos might
be great or the art that you're making might be great.
I think it's important to document like how much time
and effort and work goes into making the things that
you're making, and like explaining to people why it is
(29:06):
what differentiates you from someone who's shitting out fifty million
photos from Midjourney or like writing a book on
ChatGPT in three seconds and publishing forty of them
on Amazon. You know, like I think that doesn't
mean that we're going to win, that we the
humans are going to like win versus a sort of
(29:28):
like tech industry and tech platforms that are not super
friendly to humans. Yeah, but I think that that is
like the path forward. You want the work to
stand for itself. But I think you also kind of
need to be like.
Speaker 2 (29:42):
Hey, yeah, working hard over here. Like, God, that's like,
that is very pragmatic. Like I mean, essentially you have
to show here's the log, here is my chainsaw,
here is how it becomes a German shepherd statue. I
share the optimism that it does seem that most people
want what they consume, whether it's art, whether it's journalism,
(30:04):
they don't want a robot writing it. They trust the
person more than they trust the robot.
Speaker 4 (30:09):
Yeah.
Speaker 2 (30:09):
Now there's also this additional Well, the best practice is
maybe to just like occasionally remind people that you're not
a robot. I think what you're doing is much smarter
because I think sometimes I like internalize weird shame about
my own process.
Speaker 3 (30:23):
Well, I think that's been drilled into artists and been
drilled into journalists too. It's like I went to journalism school,
which is extremely embarrassing, Like in retrospect, it's like I
was being trained to like go into a dead industry, but.
Speaker 4 (30:37):
Like majored in radio. It's a thing that they.
Speaker 3 (30:41):
Teach you is like objectivity, like you're not the story,
you're not a part of it, blah blah blah. And
it's like if you do that, nothing differentiates you from
the zillions of other people who are doing the same thing.
And I think that this is actually something that like
YouTubers and influencers have been very good at. It's like
I hate to hand it to Logan Paul or whatever,
(31:02):
but it's like he has people who specifically seek him
out and you can. I just said Logan Paul because
it's the first name that came to my head. But
it's like all of these influencers, all these comedians who
kind of give you like a peek behind the curtain
and like have a personality to them, also seem to
be doing better than like the institution that is like
(31:25):
we're just going to publish a bunch of kind of
like wire service articles. That said, it's like such an
extra thing that everyone now needs to do. It's like
you don't not only do you have to like do
the work, but then you have to like explain the work,
and then you have to like foster a community and
be beholden to that community, and it is pretty exhausting,
and it can come at the expense of hanging out
(31:47):
with your friends, responding to text messages from your mom
like yeah.
Speaker 4 (31:53):
Because you're busy curating something that's just a reminder:
I'm a person.
Speaker 2 (31:59):
Thanks so much to Jason, and you can subscribe to
404 Media and check out their podcast in
the description. Their work is so fucking good and thorough,
and their reporting on tech and the Internet is unlike anyone else's.
And also thanks to Jason for just encouraging me to
share about my own process. I feel embarrassed, I feel naked.
It will never happen again, but there it is. In
(32:20):
all seriousness. This conversation was really useful for me. It
was a reminder that it's not verboten and it shouldn't
be discouraged to occasionally just say, hey, making something you
can stand by takes a while. When you hear these
shorter episodes, that's what's happening. I want to make sure
that you're listening to information that is good and thoughtful,
and I want to make sure that I don't accidentally
(32:42):
kill myself doing it, So thank you for listening. Try
not to work until three in the morning, and text
your mom back. We'll be back next week.
Speaker 4 (32:50):
Glad You're here.
Speaker 2 (32:55):
Sixteenth Minute is a production of Cool Zone Media and iHeartRadio.
Speaker 4 (33:00):
It is written, hosted, and produced by me, Jamie Loftus.
Speaker 2 (33:03):
Our executive producers are Sophie Lichterman and Robert Evans Llamas
and Ian Johnson is our supervising producer and our editor.
Speaker 4 (33:11):
Our theme song is.
Speaker 2 (33:12):
by Sad13, and pet shout outs to our dog
producer Anderson, my cats Flee and Casper, and my pet
rock Bert, who will outlive us all. Bye.