Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:13):
From Kaleidoscope and iHeart Podcasts, this is Tech Stuff. I'm Oz Woloshyn and I'm Karah Preiss. Today we're going to get into the headlines this week, including an anti-tech rally in New York City and a look at how workslop is affecting the workplace. Then, on Chat and Me:
Speaker 2 (00:30):
I had students take personality tests in Spanish and then
used AI to turn their personality data into unique paintings.
Speaker 1 (00:39):
All of that on the Week in Tech. It's Friday, October third. Hello, Karah.
Speaker 3 (00:47):
Hi, Oz.
Speaker 1 (00:48):
One of the great pleasures of recording this podcast with
you is getting to hear you chew ice right next
to the microphone before we start. I hope your mouth
is nice and cool.
Speaker 3 (00:58):
I'm actually numb. I'm, like, talking like I just got a tongue piercing.
Speaker 1 (01:03):
My ears are numb as well.
Speaker 3 (01:06):
Well, speaking of being in the studio together, it's your turn in the hot seat now for me to make fun of you. Are you even wearing shoes today?
Speaker 1 (01:15):
Oh, it's a deep cut. I actually am right now, but
you obviously noticed that I do have an unconscious habit
of slipping them off.
Speaker 3 (01:22):
I do, and it's disgusting.
Speaker 1 (01:24):
It is. Well, I consciously try to keep them on, in fact. But I did get in trouble. I think I probably told you this story: many years ago, when I was a consultant working on the Google account, I once unconsciously slipped them off in the Google office and was padding around Google's actual offices, and my boss took me aside and said, listen, you're a
(01:45):
great guy, but if you ever take your shoes off in a meeting again, you're fired. Which is totally fair enough.
Speaker 3 (01:52):
It's such a power move.
Speaker 1 (01:53):
It was. I was so whacked out that I was not even aware of it.
Speaker 3 (01:59):
You had six hundred coffees probably.
Speaker 1 (02:01):
By then, exactly. But that's why I was particularly delighted to be ahead of the curve on this. It's now the norm in Silicon Valley, apparently, according to the Fortune article you shared to shame me in our full producers-and-hosts Slack channel, with the headline, the hottest workplace policy at startups right now: no shoes. So I'm curious,
(02:21):
Karah, how would you feel if we had a shoes-off policy at Kaleidoscope?
Speaker 3 (02:25):
The thing is that I am so obsessed with dumbing down the workplace to make workaholism...
Speaker 1 (02:34):
Feel comfortable, exactly. And that's exactly what this policy is all about. So that's interesting. There was actually a picture in the Fortune article of a shoe rack.
Speaker 3 (02:43):
Did you see that? It was like a Japanese restaurant.
Speaker 1 (02:46):
Yeah, it looks like somewhere between a Japanese restaurant and a high school, right? I mean, it's like a locker full of dirty trainers. It's really the aesthetics of the new Silicon Valley.
Speaker 3 (02:55):
Weird, but it's a sort of sinister Trojan horse, in the way that, you know, they used to... I mean, I was just reading in the article all of these ways that companies make offices more appealing, and I think going in and being like, you know what, I'm walking right back into my apartment to go to work, is like...
Speaker 1 (03:10):
Gotcha: free haircuts, but shoes off.
Speaker 3 (03:14):
That's right.
Speaker 1 (03:15):
Well, I'm obviously obsessed with the workplace, sadly, so I couldn't resist this story in the Harvard Business Review, which went around the internet and had the headline, AI-generated workslop is destroying productivity.
Speaker 3 (03:29):
When was the last time you heard the word slop?
Speaker 1 (03:31):
I hear the word slop all the time now, Yeah.
Speaker 3 (03:33):
It's weird. It's like AI has brought slop back into
the conversation when it was never that, Like slop was
not a thing.
Speaker 1 (03:41):
You're so right. The twenty twenty-six word of the year is going to be slop, I do agree. I think I first started hearing it at the beginning of this year: AI slop, AI slop, AI slop. And now it's, like, everywhere. So, workslop. Well, guess what it is.
Speaker 3 (03:53):
The bottom of the barrel.
Speaker 1 (03:56):
It's AI-generated crap that looks a little bit like real work. And I was pretty interested in this because I am not only the host of Tech Stuff with you (you are), I'm also the co-founder of a podcast network called Kaleidoscope, which is the network that produces this podcast. And in my CEO hat, obviously in response
(04:17):
to investors and culture and whatever else, I'm always encouraging our team: how can we use AI? How can we be part of the AI revolution? How can we not be left behind? On the other hand, when people send me work that's clearly been made by AI, I'm absolutely furious. Which is, I think, the paradox at the heart of this Harvard Business Review study, which found
(04:38):
the following, quote: Approximately half of the people we surveyed viewed colleagues who sent workslop as less creative, less capable,
and less reliable than they did before receiving the output.
Forty two percent saw them as less trustworthy, and thirty
seven percent thought that the colleague was less intelligent afterwards.
Speaker 3 (04:56):
But if you're already sort of not in love with your colleagues, now there's a new reason to think that they're slacking off.
Speaker 1 (05:05):
Essentially, yeah. I mean, it's interesting, right? Not all AI-assisted work is slop. Obviously there is workslop, but there's also the sort of lazy thing where you say, oh well, let me have AI generate the report and not think. And I think that's what this article is really about.
Speaker 3 (05:23):
I looked at this article too. There's this idea of
like the distinction between a pilot and a passenger. I
can't exactly wrap my head around that.
Speaker 1 (05:31):
Well, what do you imagine it is?
Speaker 3 (05:33):
Well, a pilot flies the plane.
Speaker 1 (05:35):
Yeah, what does the passenger do?
Speaker 3 (05:37):
Rides the plane?
Speaker 1 (05:39):
The passenger sits in the back.
Speaker 3 (05:41):
Are you a pilot or a passenger?
Speaker 1 (05:43):
I'm not as much of a pilot as I would like to be with AI, honestly. But the distinction they're making is: pilots are people who actually take these new generative AI tools and harness them to do better work, and passengers are people who use them either because their boss has told them to or because they're trying to shortcut doing their work. So essentially, you know, passengers
(06:03):
churn out AI slop, and pilots, at least according to
this study, use AI to be better workers.
Speaker 3 (06:09):
This actually reminds me of one of my recent favorite
party term drops, which is cognitive offloading, which you know,
is when we offload critical thinking to generative AI and
just how much that puts the person who's using AI
at a deficit. And I think that slop, I guess,
is a product of cognitive offloading.
Speaker 1 (06:27):
It is. And interestingly, it's not just cognitive offloading on behalf of the employee, but also, in many cases, cognitive offloading on behalf of the boss. A lot of these mandates are basically, you have to use AI, use AI; the person says, okay, I will. But if you don't set a proper playbook for how to use it
(06:49):
and some training for how to use it smartly, then, no surprise: garbage in, garbage out. This story appealed to me not just because of the, you know, workplace context, but also the macro context and these swirling questions about whether all this investment in AI will actually pay off and transform the economy. And the Financial Times
(07:10):
had a great piece where they actually used AI, but in an intriguing way, to do a kind of meta-analysis of SEC filings from different companies and how they're using AI. The takeaway was, quote, the biggest US-listed companies keep talking about AI, but other than fear of missing out, few appear to be able to describe how
(07:31):
the technology is changing their business for the better. Stick
with that for a moment.
Speaker 3 (07:35):
That's really crazy.
Speaker 1 (07:37):
On top of that, MIT put out a report in July that found that ninety-five percent of organizations are getting zero return. What do you mean by that? Well, I guess the benefits are not being realized in the real world by most people. The HBR piece goes on to talk about how, based on self-reporting, forty-one percent of employees say they're dealing with AI slop, and
(07:58):
each instance of workslop costs them almost two hours to clean up, whatever the slop is. So Harvard Business Review goes on to estimate that each incident of workslop creates what they call an invisible tax of one hundred and eighty-six dollars per month per employee. Now, I'm going to do some math here. Please do. For a company of, say, ten thousand people, if forty-one percent
(08:22):
of those people are dealing with workslop, and that forty-one percent is wasting one hundred and eighty-six dollars per month to fix the slop, that comes out to over nine million dollars of lost productivity per year for a company with ten thousand employees.
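For anyone who wants to sanity-check that back-of-the-envelope math, here is a minimal sketch in Python, assuming a hypothetical ten-thousand-person company and the HBR figures quoted above:

# Back-of-the-envelope check of the workslop numbers quoted above.
# Assumptions: a hypothetical 10,000-person company; 41% of employees affected
# and a $186-per-month "invisible tax" per affected employee, as cited from HBR.
employees = 10_000
share_affected = 0.41
monthly_cost_per_affected_employee = 186  # dollars

affected = employees * share_affected                      # 4,100 employees
annual_cost = affected * monthly_cost_per_affected_employee * 12

print(f"Affected employees: {affected:,.0f}")
print(f"Annual lost productivity: ${annual_cost:,.0f}")    # roughly $9.15 million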
Speaker 3 (08:35):
That's a lot of money to fix the slop. Actually,
do you think it is going to stop people from
integrating AI tools or stop founders like yourself?
Speaker 1 (08:43):
No, definitely not. Because it's exciting, it has tremendous promise, and it could change the world. At the same time, there is this deep irony here, right? So
it may not really work right now, but if it
does work and we don't invest in it, we're dead
in the water.
Speaker 3 (08:58):
All right. So moving away from the workplace, I want
to talk about a staggering figure I read in an
article from Business Insider this week, which is this: five hours of daily screen time equals fifteen years of life by the age of seventy.
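That figure roughly checks out arithmetically. A minimal sketch, assuming the five hours a day are counted every day from birth to age seventy:

# Rough check of the screen-time figure quoted above.
hours_per_day = 5
years = 70

total_hours = hours_per_day * 365 * years        # 127,750 hours
years_on_screens = total_hours / (24 * 365)      # about 14.6 years

print(round(years_on_screens, 1))                # prints 14.6, i.e. roughly fifteen years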
Speaker 1 (09:14):
Man, that's depressing. I think I'm closer to ten hours.
Speaker 3 (09:18):
You think you're on ten hours of screen?
Speaker 1 (09:19):
Just phone and computer. Phone plus computer.
Speaker 3 (09:22):
Is that your life, or is that another life that could be lived?
Speaker 1 (09:25):
Definitely the latter.
Speaker 3 (09:26):
It's crazy. Well, so this was in a Business Insider article about a group of people who are kind of reclaiming the term Luddite, who gathered at the High Line in New York City to protest the use of screens.
Speaker 1 (09:43):
Essentially, yeah. You mentioned reclaiming the term Luddite. Luddite is used as a smear, or has been used as a smear, to describe people who have, like, an aversion to technology. But I think the word has been rehistoricized this year, and people have done work on the origin story of the Luddites, who were brave, forward-looking protesters
(10:03):
who at the beginning of the Industrial Revolution were willing to sacrifice their lives to protest against the mechanization of society and how it was making people jobless, and in some cases go hungry, and in some cases be killed by machines. So it is interesting seeing this word come back into vogue.
Speaker 3 (10:20):
I think in this case, for these people who were gathering, I don't know how much it had to do with the state of work as much as it had to do with how much time we are wasting on our screens and what the human cost of wasting that time is. Business Insider sent someone to the rally, and she described what she saw there. So
(10:40):
in the article she actually said a number of people were dressed up, and lots of them wore colorful hats in the shape of cones, meant to symbolize the down-to-earth, humble garden gnome.
Speaker 1 (10:51):
Interesting. Lots of critique of the tech industry. Do you think that dressing as a gnome will make the lords of the universe quake in their boots?
Speaker 3 (11:01):
I don't think quake in their boots, necessarily. I think there's a sort of tongue-in-cheek aspect to this rally. The hats were one example of that. There's also the name of the event, which I loved: Scathing Hatred of Information Technology and the Passionate Hemorrhaging of Our Neoliberal Experience, which, as you may guess, is an acronym for SHIT PHONE.
Speaker 1 (11:23):
I don't think I've used the word neoliberal since I was in college. In fact, I've maybe never used it. What's the core of their message?
Speaker 3 (11:30):
Yeah, you know, according to the reporter at Business Insider, the goal is to actually advocate for healthier relationships with technology and, quote in the article, to take a conscious step back from the social media apps, which I agree with and need to implement in my own life. I was actually telling you this before we started the show: there was this very serendipitous moment where a woman
(11:50):
came into the coffee shop where I was working, and she said, can you please charge this phone? And I look at it and it's a flip phone, and so I'm thinking to myself, here's an opportunity to have a conversation with her. And I said, you know, when did you get this flip phone? And she said, I got it a few months ago because I was noticing that my memory was getting really bad. Like, she just had a hard time remembering things that she was reading. And
(12:12):
she said that since she started using this flip phone, her memory has come back within a matter of weeks. I don't know exactly what that looks like for her, but it was just this amazing moment where I'm reading this piece about, you know, techno-Ludditism, and this woman comes in doing
Speaker 1 (12:29):
Exactly that. Living that life. Yeah, I mean, how old was this woman?
Speaker 3 (12:33):
She must have been our age.
Speaker 1 (12:34):
Okay, yeah. But the protesters, they're Gen Z. And I think one of the trends which is emerging is that the most anti-tech generation is the generation who grew up most on tech, which is kind of interesting.
Speaker 3 (12:45):
I think that's because they never grew up without it,
and so I think there is a little bit of
a nostalgia for a time that they never had. It's
like me using vinyl, you know what I mean, Like,
I think it's cool because I didn't have to use it.
But Oz, this small but mighty rally wasn't the only
tech rejection I heard about over the weekend. Have you
actually been in the West Fourth subway stop recently?
Speaker 1 (13:05):
I have?
Speaker 3 (13:06):
Have you seen the ads for friend dot com?
Speaker 1 (13:09):
No. Well, I have seen them, not at West Fourth Street; I saw them in another subway station. They're these, like, big white posters with a picture of a mysterious device and a dictionary definition of the word friend.
Speaker 3 (13:20):
Those are actually ads for a wearable device called Friend, and the ad copy says things like, I'll never bail on dinner plans, you know, stuff like that. I originally found this story that I'm telling you right now on Tumblr. I actually didn't see it in the subway, but there was a thread on Tumblr full of pictures of these ads that are completely covered in graffiti. So over
(13:43):
these Friend ads, people would write things like surveillance capitalism, or get real friends, or stop profiting off of loneliness, or friends are flesh and blood.
Speaker 1 (13:53):
In other words, people very much objected to this company claiming ownership of the word friend. What exactly is the product, though? It's like a wearable AI?
Speaker 3 (14:02):
This is a wearable device that looks like a miniature smart speaker that you wear around your neck. It listens to you all day like a friend. It collects your data, and then you can talk with it via text about what's going on. Via text? Yeah, text. I mean, a lot of my relationships are like this. And according to Adweek, friend dot com spent over one million dollars on subway advertising. The CEO, Avi Schiffmann,
(14:26):
designed the creative himself and boasted on X that, quote, this is the largest ad campaign in New York subway advertising history.
Speaker 1 (14:34):
I read that Avi Schiffmann raised five million dollars for Friend a year ago and spent, as you mentioned, one million on this campaign. And he said to Adweek, I don't have much money left. I spent it all on this. It's a huge gamble. This makes me laugh because I have a pretty strong feeling, and, you know, I'm waiting
(14:55):
to be corrected that Friend will not take off as
a product.
Speaker 3 (14:59):
I think what's really funny about it is that sometimes the marketing is the thing itself. And I think it's really funny that people are engaging so much in the defacing of these ads. And I think what's interesting is that people know exactly what it is when they look at the ad, and they also know that they don't want to engage in using it. And so I think the proof is in the pudding in terms of
(15:20):
what people are writing on these ads. And again, it ties back to this growing trend to pull away from the onslaught of, like, digital interfaces that we have in our lives. It's just, I think Gen Z is really aware of how much products and devices are co-
(15:43):
opting their lives. They're mad as hell and they don't want to take it anymore.
Speaker 1 (15:48):
I think the big question for me really is: will this anger and frustration that's coalescing around the role of technology in our lives become a political force? Will it be organized enough and durable enough to actually drive any change in the way technology is used, the way it's regulated? I mean, it's, you know, it's really
(16:11):
it's really, really hard to make society-wide changes. Of course, it did happen with cigarettes and seatbelts and sugar, and so, you know, we'll see. After the break, some more headlines: a top video game company is going private in a fifty-five billion dollar deal, Italy becomes the first country in the EU to pass its own AI regulations, and an
(16:33):
AI actress has agents lining up to sign her. Then, on Chat and Me, we learn Spanish. Stay with us. So, Karah, you asked me at the beginning of the episode whether I was wearing shoes.
Speaker 3 (16:52):
Yeah I did.
Speaker 1 (16:53):
I'm going to ask you now: do you play video games? Hell yeah, you do.
Speaker 3 (16:58):
I knew you didn't, and I was wondering if you were going to ask me. What do you play? I'm a big Mario Kart 8 Switch user.
Speaker 1 (17:05):
Oh interesting, yeah.
Speaker 3 (17:06):
I fall asleep with it on me all the time
and it hits me in the face and then I
wake up.
Speaker 1 (17:11):
So that's a resounding yes. Yes. Have you heard the story about Electronic Arts this week?
Speaker 3 (17:15):
EA Sports, it's in the game. If I say
that to my sister, she'd be like, yep, I know
exactly what you're talking about.
Speaker 1 (17:20):
So it's a massive video game company, with titles like FIFA, Madden, and The Sims, and it has agreed to go private for the price of fifty-five billion dollars. According to The New York Times, if the deal goes through, it will be the largest buyout of a publicly traded company ever, not adjusting for inflation.
Speaker 3 (17:39):
So what happens to the shareholders?
Speaker 1 (17:41):
They get money: two hundred and ten dollars per share, which is a twenty-five percent premium over the company's share price as a public company.
Speaker 3 (17:52):
So who's paying for this? This is just crazy money.
Speaker 1 (17:55):
A group of investors led by Saudi Arabia's Public Investment Fund; Affinity Partners, which is Jared Kushner's private equity firm; and finally another firm, Silver Lake, which is also rumored to be part of the TikTok deal.
Speaker 3 (18:09):
So that is the other huge deal that's said to
be decided in the coming months.
Speaker 1 (18:13):
I mean, to me, these stories go side by side. Obviously, Jared Kushner is involved in the EA deal and, you know, benefiting from his Trump connections. The TikTok deal has, you know, Larry and David Ellison front and center in a deal being brokered by the Trump administration and the Chinese government. It's interesting to me, you know, how US technology capitalism
(18:34):
is kind of more and more integrating with the state, or at the very least with the Trump Friends and Family
Speaker 3 (18:42):
Circle. Yes, Trump F-and-F, as we like to call it. So what's going to happen next in this Electronic Arts buyout?
Speaker 1 (18:49):
Well, the deal has to be approved by the government's
Committee on Foreign Investment, which reviews foreign buyouts for security concerns.
There are concerns about data, of course. I mean a
lot of people play video games in the US, which
means there's a lot of data that could be collected
by a foreign government.
Speaker 3 (19:07):
So do you look at app store charts?
Speaker 1 (19:10):
I look at podcast charts because.
Speaker 3 (19:11):
You want to see how your podcasts are doing because you're a founder. That's why. I actually look at the App Store charts sometimes because I think it's really interesting what's trending. Usually it's the same stuff that's trending, but last week I actually noticed that there was this new app that I'd never seen before, called Neon Mobile, at the top of the chart.
Speaker 1 (19:27):
The top of the charts.
Speaker 3 (19:28):
Yes, it was at the top of the charts, and then it disappeared, or rather stopped working, because of a privacy issue.
Speaker 1 (19:34):
Well why was it so popular in the first place?
Speaker 3 (19:36):
Because people were making money?
Speaker 1 (19:38):
Of course. Okay, back up a couple of steps. What is Neon Mobile and how are people making money from it?
Speaker 3 (19:43):
So the app basically offered to pay you money for
recordings and transcripts of your side of phone calls, which
was data that they would then sell to AI companies.
Speaker 1 (19:56):
So wait. I basically, as a Neon user, would allow Neon to record my calls, my private calls, and then sell them on to other companies?
Speaker 3 (20:06):
I know, when you say it like that...
Speaker 1 (20:08):
It like that, how much people getting paid?
Speaker 3 (20:10):
What the website says about Neon is that they pay
thirty cents per minute if you're calling another Neon user,
so if you had Neon and I had Neon and
we were calling each other, they would pay us thirty
cents per minute. If I was calling a non Neon
app user, it's fifteen cents per minute. The max that
you can make is thirty dollars a day. Last week,
(20:31):
this free app had thousands of users and was downloaded
seventy five thousand times in one day. So selling your
personal data to train an AI is one thing, but
TechCrunch actually discovered something even more sinister, which is this
security flaw that allowed users to access the phone numbers,
call recordings, and transcripts of any other user on the app.
Speaker 1 (20:54):
So it wasn't just that you were consciously selling your phone calls to companies. There was also, unconsciously, an actual flaw which meant that everything you did while using Neon was exposed to everyone else on Neon. Good old journalism. And that's why, as you mentioned,
it went offline.
Speaker 3 (21:12):
Yes, for now. For now. Neon actually intends to come back with a vengeance, but they have not given any indication of how long it will take to fix the privacy flaw. The reason I sent this to our producers is because I was going to sign up for it. Yeah, because I'm a...
Speaker 1 (21:26):
A free thirty dollars a day? Not bad. That would be thirty dollars a day at thirty cents a minute, so a hundred minutes, which would even outstrip your average minutes per day on the phone.
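To make that arithmetic concrete, here is a minimal sketch using the per-minute rates and the thirty-dollar daily cap quoted from Neon's site above; the minute counts are just the implied arithmetic, not anything Neon publishes:

# How many minutes of calling it would take to hit Neon's daily payout cap,
# using the rates quoted above.
rate_neon_to_neon = 0.30   # dollars per minute, calling another Neon user
rate_non_neon = 0.15       # dollars per minute, calling a non-Neon user
daily_cap = 30.00          # maximum payout per day, in dollars

print(daily_cap / rate_neon_to_neon)   # 100.0 minutes of Neon-to-Neon calls
print(daily_cap / rate_non_neon)       # 200.0 minutes of calls to non-Neon users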
Speaker 3 (21:41):
But I think that the average user relates to me and sees getting paid for phone calls with no downside, and thinks, yeah, I'm going to sign up for that.
Speaker 1 (21:48):
Why not? Except there is a downside.
Speaker 3 (21:50):
There was a huge downside. I think the takeaway here is to be very cautious when you download any apps that ask for permission to record your phone calls and access your contacts.
Speaker 1 (21:59):
One hundred percent. The takeaway for me is slightly different, which is how, you know, this surveillance society doesn't need to be imposed; we kind of volunteer for it ourselves. And we were talking about Larry Ellison; he said last year that citizens will be on their best behavior because we're constantly recording and reporting everything that's going on.
Speaker 3 (22:20):
I think what's evident here is that people have gotten a little bit cynical about surveillance capitalism, and they're like, you know what,
Speaker 1 (22:27):
Pay me for it. It's happening anyway, pay me for it.
Speaker 3 (22:28):
That's exactly right.
Speaker 1 (22:29):
You know, elsewhere in the world, AI is actually being regulated, and Italy passed a landmark law last week that addresses a number of issues that have bubbled up on Tech Stuff over the last few months.
Speaker 3 (22:42):
What are some of the issues that have bubbled.
Speaker 1 (22:44):
Up? Well, per Reuters, the new regulations in Italy include restrictions on copyrighted content for AI-generated text; required parental consent for AI access for children under the age of fourteen, which, by the way, I think is a huge one; and enforced transparency around AI use in the workplace, meaning employers will be required to inform workers when AI is being deployed.
(23:07):
And in the healthcare setting in particular, AI is allowed to be used to assist in diagnosing patients, but doctors will have to continue to make the final call, and also to inform their patients about how AI was used in the diagnostic process.
Speaker 3 (23:22):
That last one actually reminds me of the conversation we
had with Robert Capps about AI in the workplace and
how transparency and accountability are going to be a huge
part of our jobs as human AI collaboration grows more popular.
Speaker 1 (23:34):
I think that's right. I mean, a lot of these are guidelines for how to work in an ethical way with AI, but there is also a section devoted to AI crime and punishment: prison terms ranging from one to five years for people who, quote, use technology to cause harm, such as generating deepfakes, and even
(23:57):
harsher penalties for those who use AI to commit crimes like fraud and identity theft. So when I said that this law addresses a lot of things we've talked about on Tech Stuff this year, it really does. On the other hand, without being too cynical, I'm not sure how consequential a national law can be for an international technology.
Speaker 3 (24:16):
That's what I was just thinking.
Speaker 1 (24:17):
And yeah, this is, to be fair, built on top of an EU AI law that was passed a few months ago and is being implemented in stages. The EU as a whole, I think, has this kind of reputation of being the regulator-in-chief, but it doesn't have the political clout of China and the US. So it'll be interesting to know whether, you know, as, for example, these new Luddite movements emerge and gain political
(24:40):
clout here in the US, will they look to Europe and say, oh wow, it's interesting, they've actually done something concrete, and demand that legislators here do something similar? Or will this be kind of shouting into the wind by a bloc of countries who don't, on the whole, actually create AI products? Like, in
Speaker 3 (24:56):
Other words, good for Italy, But yeah, does it matter
for anyone else?
Speaker 1 (25:00):
Is it going to be any more relevant than the Colosseum?
Speaker 3 (25:03):
So the last story I want to bring you is
about one of my favorite subjects, which is AI in Hollywood,
which brings me to a very popular story this week
about Tilly.
Speaker 1 (25:12):
Norwood. Who is Tilly Norwood?
Speaker 3 (25:15):
Tilly Norwood has been all over my Instagram feed all week, because I work, as you know, in show business when I'm not doing this, and I noticed on my Instagram feed all these people posting, like, lol Tilly Norwood, lol, who's going to sign Tilly Norwood? And I googled Tilly Norwood last week, and it turns out that Tilly Norwood
(25:36):
is AI. Hold on.
Speaker 1 (25:38):
So you became aware of Tilly Norwood before you became aware she was an AI character? That's correct. You were like, a star is born, I've got to know more.
Speaker 3 (25:46):
Well, like, people were posting photos of her, and I was like, is she an actress? Am I supposed to know her? Or is she a new ingénue? And then, I mean, I'm kind of exaggerating, but very quickly I googled her and I realized, oh, she's AI.
Speaker 1 (25:57):
Why had everyone suddenly become so fascinated by this one AI-generated actress? I mean, there are so many.
Speaker 3 (26:05):
There's a very specific reason, actually: they're looking for an agent who's going to sign her. They're treating her like a real actress, and so a lot of the further discourse on Instagram was people making jokes like, lol, who's Tilly going to go with?
Speaker 1 (26:19):
And for?
Speaker 3 (26:20):
I might be wrong and this might have been something
that people talked about with other sort of AI generated characters,
but this is the first time I've seen an AI
actress be talked about in the way that other actresses
are talked about.
Speaker 1 (26:34):
What have actors been saying?
Speaker 3 (26:35):
They're basically saying that, like, this is bullshit. They're like, this could literally come for our jobs. I want to actually read you these two quotes from Tilly's creator, Eline Van der Velden. She put out a statement on Instagram saying Tilly Norwood is not a replacement for a human being, but a creative work, a piece of art.
(26:56):
I see AI not as a replacement for people, but as a new tool, a new paintbrush. I think it's also important to note that when Tilly debuted, Eline said, quote, she wants Tilly to be the next Scarlett Johansson or Natalie Portman.
Speaker 1 (27:12):
Do you think Scarlett Johansson and Natalie Portman identify more with being a tool or a paintbrush? What do you think? I don't know, probably more as humans.
Speaker 3 (27:22):
I mean, I guess. I know what she's saying in terms of actors reading lines, but I think that's blurring the lines between being a person and a technology in a way that people are uncomfortable with. Still, SAG actually put out a statement, and in that statement they claim that Tilly was trained using performances from actors without permission or compensation, which is actually out of compliance
(27:46):
with their new union contract. And Gersh, which on Entourage Ari Gold would call a second-tier agency (but I don't agree with that, I love Gersh), Gersh has great clients, and Tilly won't be one of them.
Speaker 1 (28:04):
Well, that's funny. I wonder, I wonder if she will
end up getting signed by one of the major agencies.
Speaker 3 (28:09):
It's like the most lol thing to me, and it's so agency. It's like: here's a product or a commodity that is hot right now, go chase after it. I don't think anyone at the agencies is, like, overthinking the fact that she's AI. I think actors are like, wait a second, the agency that represents me is looking to represent an AI? Are they condoning, you know,
(28:30):
movies being made with AI actors in them? And I think that that's a fair question for actors to be asking. I don't think it's outrageous.
Speaker 1 (28:57):
Okay, Karah, it's time for Chat and Me, the moment where we hear from our listeners about how they're using AI in their daily lives. And this week was interesting, because we got a submission and our producers weren't sure if the submission was read by AI or by a real person's voice, and so we had a back-and-forth over email
(29:18):
with the person who submitted it, who said that he's a teacher and therefore may have a somewhat robotic voice as a result, but also that his voice may have been compressed by the voice recording software he's using. So this is kind of a meta Chat and Me. The topic is not meta, but I wanted to bring up the meta framework of this, because this is kind of the dominant question of our age.
Speaker 3 (29:39):
Is that person real? This week we heard from Sean.
His voice is real and he is a digital communications
lead at what we in the US call a technical college,
and part of Sean's job is to teach both his
students and his colleagues to incorporate AI responsibly.
Speaker 1 (29:55):
I'd be interested to get his take on the workslop story we talked about.
Speaker 3 (29:59):
Well, I think in this case, Sean is clearly a pilot, not a passenger, unlike me. He's been thinking very creatively about how to harness AI for his work, and he says his school wants to use AI to reduce the workload on staff, enhance creativity, and support personalized learning.
Speaker 2 (30:16):
I had students take personality tests in Spanish and then
used AI to turn their personality data into unique paintings.
We then held an exhibition where other Spanish speakers could
engage with the students, who had to introduce the pieces
and answer questions.
Speaker 1 (30:34):
I love this sort of paradigm of human, machine, human, machine engagement and making. I wish I'd been in Sean's class.
Speaker 3 (30:41):
I also like that he's not just an AI booster,
that he actually sees the whole picture.
Speaker 2 (30:45):
While we're enthusiastic about AI, we're also careful about using
it responsibly. I have worked with senior leaders to issue
clear AI guidance for students and staff to outline how
these tools should and shouldn't be used. In practical terms,
this means AI is to be used as a supportive tool,
not as a way to do a teacher's or a
(31:06):
student's work for them. By cultivating these habits now, we're
hopefully preparing our students to use AI wisely in their
future workplaces because AI tools will be part of their careers.
As an educator, I find this incredibly exciting. Most importantly,
our students are learning with AI and not in fear
of it.
Speaker 1 (31:26):
So in case anyone was wondering what the opposite of workslop is, we can thank Sean, because we have our answer.
Speaker 3 (31:33):
Thank you, Sean, for submitting your Chat and
Speaker 1 (31:34):
Me. And please, listeners, we want to hear more from you. We want more Chat and Mes. Send your stories to tech stuff podcast at gmail dot com.
Speaker 3 (31:47):
That's it for this week for Tech Stuff.
Speaker 1 (31:49):
I'm Karah Preiss. And I'm Oz Woloshyn. This episode was produced by Eliza Dennis, Melissa Slaughter, and Tyler Hill. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts. The engineer is Bihe Fraser, Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song.
Speaker 3 (32:08):
Join us next Wednesday for Tech Stuff: The Story, when we will share an in-depth conversation with David Ignatius all about spycraft and how the CIA is faring in the technological age.
Speaker 1 (32:18):
And please do rate and review the show wherever you listen,
and send us your thoughts at tech stuff podcast at
gmail dot com. We love hearing from you.