Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
I was listening to a weird Al song today where
he name drops Kanye, which, yeah, taky because taky, but
it's not, it's just a really it's just weird catalog.
Speaker 2 (00:22):
Huh.
Speaker 1 (00:22):
My kids were like on the way to school today,
were like, is weird Al still alive?
Speaker 2 (00:28):
Weird Alse still alive? Ship? He's I have a oh
I should say I have a picture with him. He's
but a baby.
Speaker 1 (00:36):
He's also Kermit vibes. Oh big Kermit vibes. Yeah, damn,
I guess sixty he's sixty five. Weird aw man beautiful,
still got it though, Taki Taki?
Speaker 2 (00:50):
Where's he at now? Does he have like a not
like us one? Yet? He has been quiet and I
you know, where is he? Kendrick and Drake Beef? Did
he do a meet the grams? He actually dig you
for you? Yeah, I'm looking if you did. Pekaboo, that
(01:10):
would be fucking amazing. What are they talking about? They're
not talking about anything? What are they talking about? They're
talking about They're not talking about anything. Pika Boo. I
just I just freaked out a three month old Pika Boo.
Speaker 1 (01:30):
Hello the Internet, and welcome to Season three, eighty six,
episode two of turd Eylands. Guys.
Speaker 2 (01:37):
It's a production of iHeartRadio.
Speaker 1 (01:39):
It's a podcast where we take a deep dive into
America's shared consciousness.
Speaker 3 (01:44):
And it's Tuesday, April twenty ninth. Yeah, you know, we're
almost April. It's National Zipper Day.
Speaker 2 (01:54):
Shout out zippers, shout out just fastening, just just zipping up,
you know what I mean, shut up? The national it's
the Peace Rose Day, National Peace Rose Day, which is
name of a like a famous rose that like a
French horticulturalist had sent out like before the invasion of
France by the Nazis to try and preserve his unique rose.
And we caught the peace rows here in America and
(02:16):
also national like in terms of like maintaining peace.
Speaker 1 (02:21):
Yeah, was he he sent the roses to the Nazis
and was like, hey.
Speaker 2 (02:25):
Well so he was in France and he's like the
I guess the sort of anecdote goes is, because the
invasion was imminent, he sent cuttings to people in Italy, Turkey,
Germany's somehow at that point is like Germany and the
US to be like, hold on to these, hold on
to these, and yeah, it ended up being I think
(02:45):
given out at like one of the first meetings of
the UN. And it's also but all that to say,
the most important thing is National Shrimp Scampy Day. So
let's get locked in.
Speaker 1 (02:54):
Yeah, shout out to the letters YKK one your zippy.
Speaker 2 (02:58):
By the way, Yeah, very famous. I mean we might
have to be. That's a Japanese group, Japanese Zipper or ykks.
Oh yeah, sorry tariffs. You know what I mean. Our
zipper game about to get fucked up. That's like the
goaded zipper too. Like we don't loll get only.
Speaker 1 (03:15):
Wear pants that have USA on the zipper and those
always gets stale first. Yeah, most inopportune times. Yeah, oh man,
I can't believe we're almost through April. You know, I'm
gonna be the guy who keeps blaming the year be
like April twenty twenty five, bad one.
Speaker 2 (03:33):
Can't wait. May things are looking up because you'll be
with us.
Speaker 1 (03:37):
Yeah all right, my name is Jack O'Brien ak golf
till you dead, Golf golf till you dead.
Speaker 3 (03:45):
Boswell, roll Boswell, rollo boswill roll.
Speaker 2 (03:50):
On the green. That one courtesy of Sparkles.
Speaker 1 (03:53):
On the Discord, who said when Joel Monique said, Trump
can golf till he's dead. That song just started playing
in my mind and I couldn't get it out. That is,
by the way, not a threat. That is just an
invitation for him to have a lot of fun golfing
for the rest of his long, healthy life. He cannot
off till he's dead because he's so powerful.
Speaker 2 (04:14):
They need archives. The Neia Archives remix of that song
is really good. Yeah, a little drummond bass, Yeah yeah, yeah,
the why Why whyse not thekks Miles. I'm thrilled to
be joined by you, my co host, mister Miles. Thank
you so much for having me. Miles Gray, the Lord
of Lankersham aka the Showgun with No Gun, Experimental Black
(04:37):
and Knees visual artist, your boy Kusamah shout out to
Uh there was. It was Japanese America or Japanese Heritage
Night at Dodger Stadium last night, so you know, it
was very excited about that also because there was there's
unique merch that I'm trying to put my hands on. Yeah,
you know what i mean, You know what I mean,
You know what I mean. The game Uh no, no, no,
(04:57):
I did not make it to the game because her
magic You surprised me with tickets to see the Cowboy
Carter Tour. Wow, I was not able to go, not
that I even had tickets to the Dodger game. But yeah, yeah, yeah,
did that, did.
Speaker 1 (05:11):
That And we will get your review tomorrow. We'll get
your review later. Yeah. Yeah, there's there's a press embargo
as they didn't want that coming through.
Speaker 2 (05:22):
Miles.
Speaker 1 (05:22):
We're thrilled to be joined once again in our third
seat by a research associate at the Leverholme Center for
the Future of Intelligence, where she researches AI from the
perspective of gender studies. She's the co host of the
great podcast A Good Robot. Please welcome back the brilliant
doctor Kerrie mcinn.
Speaker 4 (05:41):
Doctor kay, Hi, thanks so much for having me back.
Speaker 2 (05:46):
Thank you for accepting our offer to return, you know,
to class up the joint as I.
Speaker 1 (05:50):
Say, international uh, you know, diplomacy going on. We had
to you know, work it out. But we're thrilled to
have you back. M h.
Speaker 4 (06:00):
How have you been, I mean, how have we all been?
I don't really to answer that question anymore. So I'm
always like, I guess, in the big scheme of things,
I'm personally doing really well.
Speaker 2 (06:11):
But yeah, yeah, thriving, Yeah, yeah, what's the energy? Like,
what's the energy like in Europe looking at the cess
pit that is a flame as known as the United States.
Is it like is it like a tooth like Iraq war?
Anger like I experienced going to Europe orre like what
the fuck are y'all doing over there? Or is it
(06:32):
like how we felt after Breggs there people like, oh,
is fucking themselves bad?
Speaker 4 (06:39):
I feel like it's yeah, a lot of confusion and fear.
I don't know if it's necessarily even just anger as
much as it's just people being like what is happening?
Like what are you doing? But I think there is
a degree of like shell shock to it, where just
so much keeps happening that I think you start to
like become slightly numb, which is very bad but also
very understandable. I got a bit of a refresh from
that because I was back home in New Zealand in
(06:59):
March and like, there's nothing quite like being in a
country when you watch the news every day and like
nothing happens. It's like a tree fell near the motorway,
not even on the motorway, or you know, that kind
of level of needs right, right right, And then it
came back here and I was like, oh my goodness,
this is a very different level of news.
Speaker 2 (07:17):
Yeah yeah, yeah, yeah, Well we're not getting them to it. Unfortunately.
Every day hurts every day. Damn you got anything for
that hurts it every day? Yeah? Yeah, I mean, but
that is the point. I think they want as many
people to turn off and sort of ignore what's happening.
But I think I think as things ramp up here,
people more and more realize how how much the norms,
(07:41):
as flawed as they were, were things that were worth
keeping and trying to improve rather than be like FuG everything,
get rid of it all and whatever this is.
Speaker 1 (07:50):
Yeah, all right, well we are going to get to
know you a little bit better in a moment.
Speaker 2 (07:56):
First, just a preview of some of the things we're
talking about.
Speaker 1 (07:59):
First of all, we just want to get your expert
opinion on where where are we at generally with AI.
Speaker 2 (08:05):
With with AI, like is a one, it's actually called
a one jack a one, like Linda McMahon would say, Yeah.
Speaker 1 (08:12):
Are there cool things happening? Is it that we seem
to see a lot of bad ones happening in the
United States? We want to talk about this, this signal
chat that's been powering the right word shift in tech,
powered by Mark Andresen seems.
Speaker 2 (08:30):
Like a cool guy.
Speaker 1 (08:31):
It's the person whose name that I feel like, now
I have to learn after reading about this. We're gonna
see how Trump wants to use AI. We're gonna talk
about using AI large language models to entrap people online,
people who have signed over their likenesses and now not
so psyched about that decision. All of that plenty more.
(08:52):
But before we get to that, we do like to
ask our guests, what is something from your search history
that's revealing about who you?
Speaker 4 (09:00):
Ooh, well, I mean, I guess do you know the
singer Lord. I'm a very big fan. She's just had
a new single out for the first time in like
four years, and she goes like very underground that just
like suddenly pops up again, and so I'm very excited
for it's hopefully going to be Lord's summer. And yeah,
I've been following the single following the kind of weird
sort of I don't know if you saw her botched
(09:20):
kind of mini and New York spontaneous concert that immediately
got canceled because you didn't have a permit was following that,
and yeah, I'm just excited for whatever happens next, hopefully
an album, hopefully not more botched concerts.
Speaker 1 (09:33):
I was actually talking about Lord this morning to my
children because we were listening to the weird Al song
Foil on the drive to school, which is a royal, yeah,
which is royals and it's just about tinfoil, and they
were like, what was the song about? Originally I was like, well,
there's this really great artist named Lord that you should
have probably learned about before you heard the song. But anyways,
(09:57):
it didn't have a ton of time to dig into it.
Speaker 2 (09:59):
But I'm glad.
Speaker 1 (10:00):
Eric said, I'm on the I'm on the weird Al
watch to see what he's going to be dropping. He's
he's given us an even longer desert to walk through.
Speaker 2 (10:09):
But what's you go? Oh yeah, sorry, I was curious,
Like I was. I saw somebody post on Blue Sky
some kind of meme about how Lord fans are experiencing
like this third album and comparing it to another artist
is like, what, how do you like? It felt like
where people really split on the second album and are
now looking for the third, Like what's as a Lord fan?
How do you look at her discography right now and
(10:30):
what do you hope? Like, Okay, can you just help
me understand what I like what people are talking about
with this new like Lord album.
Speaker 4 (10:36):
Yeah, I mean pretty close to Her third album was
like very divisive, and so it kind of came out
and it was then like sort of twenty twenty one
was like just off the pandemic and it was super
kind of like breezy summary, like it was really different
everything should put out previously, and so I think people
felt pretty split on it. Whereas the second album, which
is also my personal favorite, Melodrama, was this kind of
like buzzy, poppy, kind of intense sort of dance album,
(11:01):
and it sounds like she's going back to like that
kind of sound again, and so I think people are
just like very excited. It's because, like post Charlie XCX,
I like want to hear her do this sound again.
Speaker 2 (11:10):
Got it, got it? Got it? Oh, so they did
the thing, which like you're not doing the thing that
we liked you from the first album, and yeah, okay,
this makes it.
Speaker 4 (11:17):
Yeah, like I kind of like that she does like
different stuff, but also, yeah, I think people are excited.
Speaker 2 (11:21):
That always helps. That always helps me discover other genres.
When an artist I really liked started tinkering with something else,
and before I would be a little bit rigid about
it and be like I don't I didn't sign up
for this. But then you're like, well, I like you
as an artist, so thank you for opening my mind. Yeah, yeah,
kid A was hard for you, huh the kid kid
My God? Yeah, absolutely, I was just I was absolutely
(11:45):
spun by that one. He says, can't be taken back.
I did. I did. I want everything to be the Bens. Actually, no,
Pablo Honey for being honest. Yeah, the first one.
Speaker 1 (11:56):
One, the well how did you experienced the Charlie XCX thing,
because that that was an interesting song. So they had
like a Charlie XCX had a song with Lord where
they talked about there's like a very honest song. They're like, yeah,
I kind of like hated you a little bit, like
(12:17):
but not really, but like I didn't know what you
were going through. No, No, it was like a song
where like they were it was like just talking about
them hanging out and like them being like kind of
just like a weird like celebrity friendship type thing.
Speaker 2 (12:34):
I don't know where were you a fan, Carrie?
Speaker 4 (12:36):
I loved it. Yeah, I never really loved like brat
in general, even though I'm like the least brat person
in the world because I like listen to what her
like party music. Couldn't go to bed at like nine
p thirty. I don't think I was the target.
Speaker 1 (12:48):
What he had better get a sprain of bread in
at four o'clock so I can't have time to wind
down before.
Speaker 4 (12:54):
Dinner prep with. But yeah, no, I love that. You know,
all those kinds of like sliding, awkward, ambivalent relationships. You know,
I thought they captured that really brilliantly.
Speaker 2 (13:04):
Yeah awesome. What is something you think is underrated?
Speaker 4 (13:08):
Super honest baff bags on planes, because like on these
European super cheap flights, they don't seem to do them anymore.
Because I found that out unfortunately last December by just
barfing on the plane because I could not find a
barf bag and I would not recommend that. So bring
your own bag as that first tip to people traveling
to Europe. A second, you know any airlines out there,
(13:29):
they must be so cheap, please bring them back. So, yeah,
I don't know when that happens. I feel like they
always used to have, you know that like sad little
baggie in the back seat for this exact scenario.
Speaker 1 (13:39):
And I feel like I never used them back then,
And now I just had to like have my seven
year old throw up into my hands on a plane
because we couldn't find it in.
Speaker 4 (13:51):
Time, Like, wait, did this happened to you as well?
Speaker 2 (13:53):
Yeah, like within the passage two minutes and you weren't
flying easy Jet or Ryanair.
Speaker 1 (13:59):
No, we were now and yeah, we were flying Asiana
and we had made it through a whole flight that
was like pretty bumpy and got back onto the tarmac
and something about the way it was like kind of
bouncing a little bit. He was just like, ohh And.
Speaker 2 (14:17):
We did not have a lot of time. I'm not
looking forward to inevitably when I have to, because I
feel like that's the thing every parent has to do,
is to buy hand receive that offering from your child.
Like I remember doing that with my mom and I
was wearing overall. I have such a vivid memory. I
was wearing overalls and my mom just put it into
that front pocket of my own wow, And I was like,
(14:37):
there you go, just put that in your overall pocket
because Mommy did not drive this far to buy something
for us to leave now.
Speaker 1 (14:45):
So I've never felt more helpless than walking up to
the flight attendant with a handful of.
Speaker 2 (14:50):
Like you have something, I could put this in Jesus.
Speaker 1 (14:57):
When did you get Because I don't really get a
lot of motion sickness, but I feel like flat flights
are generally pretty steady, and then there's like the quick turbulence.
But like motion sickness for me has always happened when
it's like a slow kind of I don't know, you know,
like you it's like sort of slowly attacking your equilibrium
(15:20):
went when did you have your onset?
Speaker 4 (15:24):
Oh gosh, I mean I feel like I'm quite prone
to motion sickness anyway. But it was one of those
like you know planes, where it's like you get into
the air, the plane to me just like shaking like
one of those toys, and then like ten minutes later
they're like there seems to be some turbuluce, like we know,
don't worry, we got We're not getting out of our seats. See.
I think it was just a bit of a rocky ride.
(15:44):
But also I had like one of those giant like
American style. Even they give you like a leader of
coffee right before I went on. So it was also
self induced in some way. I was too I was
too primed for that to happen. But you know, git
two Holidays or whatever airline we were flying, you know,
bring back the.
Speaker 1 (16:02):
Bag, right, had nothing to do with the air conditions
of the plane. They're like, yeah, sorry, the plane just
does that.
Speaker 4 (16:08):
It's sh the bags out itself, you know.
Speaker 2 (16:15):
Yeah, it shakes you.
Speaker 1 (16:17):
And then they ask you you have to pay twenty
five dollars for barsh bags price.
Speaker 2 (16:23):
It's an add on. It's not like I'm anecdotally. Some
people are saying it's because flights, because like the airplane
technology allows for smoother flights, people are just getting less sick,
Like yeah, in numbers, so then maybe they're dialing it back.
But I don't even if you.
Speaker 4 (16:40):
Don't do it, I'm like, when the zero points there
are one, do it.
Speaker 2 (16:42):
It's real bad everyone else. But I don't mind them
having a bag because I'd rather not have to see
somebody be like, excuse me, then you take this? Hey,
could you pass this up to the flight.
Speaker 1 (16:58):
No, we're not supposed to be getting out of her seat,
But what is something that you think is overrated?
Speaker 4 (17:04):
Actually another plane related one at No BARF. This time
I saw Emelia Perez the movie on the plane and
the whole time I did finish it, but the whole
time I was like, how did this get thirteen OSCAR nominations.
I'm sorry if you're big fans, but I was just like, like, genuinely,
I'm quite impressed it got that many because I was like,
this is just not like, regardless of what you think
(17:24):
about the politics and everything, I'm like, even just with
all of that stripped out and all the drama around it,
I was like, this is just a bad musical. It's
I really, I really find it quite interesting. So I
was like, well, kudos to them for.
Speaker 2 (17:39):
It did so well, you know what, I respect, Yeah,
I respect the game there because you somehow got this
thing straight to the top. Yeah, I mean, idea, I'm
efficient with my movie watching. I only watch whatever we'll
win Best Picture of the Year before it happens, and
I did it this year again. You know, only saw Anora,
don't need to do anything else.
Speaker 4 (17:59):
Yeah, yeah, yeah, the plane So that was great, but
I had to hold my headphone into the headphone jack
for the whole two plus hours of the film, and
I still I still really enjoyed that. So Nora was Yeah,
that was worth it.
Speaker 2 (18:12):
Damn these budget airlines bark bad eggs. You got to
hold the fucking cable to the headphones in the jack
to be able to hear it. Oh.
Speaker 1 (18:20):
I used to have to do that with my headphones
on my phone, like when back when like all headphones
were wired, and like there was like something going on
with my port and so I had to just always
like have it like jammed in. Oh.
Speaker 2 (18:34):
Remember when airplanes used to have the little just plastic
tubing where the sound came through the plastic tubes into
your ears before like the digital like three and a
half millimeter cables. Oh, that was an era. I just
remember the fidelity sound. Yeah. I always remember stealing it
and like trying to like drink stuff with it because
it was just like a big tube with like foam
(18:55):
ear tips.
Speaker 1 (18:56):
Yeah.
Speaker 2 (18:56):
It was like one step above just having a giant
horn that you stuck your ear, Yeah, like held on.
I remember being a kid and bringing the armrest up
and dialing the volume up and just resting my ear
ear on the two little holes where the sound came out.
Speaker 1 (19:09):
Wait, really, you didn't even know that's how it worked,
Like a dumb part of me assumed that's how it worked.
Speaker 2 (19:15):
But the tubes like, yeah, no, it was just a
plastic tube.
Speaker 1 (19:21):
And here we are talking tech ye us like doctor
mc and Ernie here. Yeah, I will say, Miles, you
you should probably start advertising yourself as like you need
to sign the same pr agent as that octopus who
like predicted who would win the World Cup three times
in a row because you only watched the.
Speaker 2 (19:40):
Whatever I stumble upon, the whatever you stumble upon? Yeah,
they're like, what what did he say? I have to
read the vibes. And if too many older people are
like this is the movie of the year, I go,
no it ain't. No, it ain't. And because of that
I will not watch it.
Speaker 1 (19:54):
I think I think it would have won if they
hadn't had the backlash because it add to that same
green book crash thing, Oh right, like by writing it. Yeah,
it's like the you know, a very complex issue and
then a movie written by somebody who has given ten
(20:17):
minutes of thought to it and written a script and
then they like put a bunch of you know, filmmaking
like good filmmaking work against that, although this one, I agree,
this is the music in the musical was not was
not outstanding, but shame shame that a shame that.
Speaker 2 (20:36):
It was daring. They love a fucking daring movie. You know.
All right, well, it's wonderful to have you back.
Speaker 1 (20:45):
We're going to take a quick break, and when we
come back, we are going to talk about technology beyond
the little tube headphones that used to plug it directly
into your airplane seat armrest.
Speaker 5 (20:57):
We will be right back, and we're back.
Speaker 1 (21:10):
We're back, and we on this podcast we ask the
hard hitting questions such as where are we at with AI? Generally,
just like generally, what's the vibes? Doctor McNerney.
Speaker 2 (21:24):
I'm all so every because I remember when we first
had you on, we were like in the midst of
like e AI EI and andlay mommy EI. Yeah, yeah,
AI researchers giving us like the warning like this could
be the next it's gonna end the world. I'm I'm
taking cyanide now to preemptive excuse myself from this apocalypse.
(21:47):
And after a while, I think that's as we started
to speak to more and more experts, we all became
rightly unconvinced by its capabilities and like, oh, it's a
fancy computer program that's like a toy basically, but they're
saying you can do so much more. Then we see
more and more articles about how people using AI are
like fucking up in like legal proceedings like the MyPillow
guy his lawyers use chat GPT recently and the judges
(22:10):
like they're like thirty critical errors in your like your
response here, Like I don't even know what this is.
But now I'm like, are we now sort of I
think because we're sort of past the thing of like
this is that world ender game changer, but still these
companies have to make the line go up. Are we
like now seeing more of a strategy to just kind
of normalize AI use to get people to slowly warm
(22:34):
up to it than sort of like the big explosive
sort of debut with like chat GPT and things that
we saw. Because I feel like now I hear more
people talk about AI when they're like miyazakifying themselves or
getting a recipe recommendation, rather than like this is going
to replace the entire medical field.
Speaker 4 (22:52):
Yeah, I mean, I think normalization is the exact right word,
and I was just thinking this today as I went
on to Google Search, and of course they now have
that overview summary and even.
Speaker 1 (23:02):
Some of my fe who you're talking about, my friend Gemini, Yeah.
Speaker 4 (23:09):
They're yeah, some answers, you know, even things like that.
I think people like me who say probably wouldn't have
gone to chat GBT just to do like basic information searches,
I have to admit I do read the AI summary
is quite a lot, and you know, and I think
that it's a way of just kind of like making
it really frictionless and really really easy to immediately use
an AI tool. And I think we see that with
(23:30):
a lot of like AI roll out and look. And
on the one hand, like I'm not completely against the
use of these AI tools if you're really finding it
useful and if it's reliable, you're kind of getting the
full picture behind the tool and what it's for. But
on the other hand, like I think something I do
worry about is something that I think has to be
integrated into all our ethical tech use is like a
meaningful sense of the opt out or a meaningful sense
(23:51):
of the opt in. So like, do I have the
right in the ability to say, like NAP I don't
want to use this, and you know, am I being
like actively consulted and being able to consent to using
particular tools or programs and so on and so forth.
And yeah, I think the current rollout of AI tools
is not really complying with that particularly.
Speaker 2 (24:09):
Well hmm right, Yeah, they're just excited.
Speaker 1 (24:12):
They have a new toy that they want to work,
that they want to use that they're going to make
a you know, super Bowl commercial about so.
Speaker 2 (24:20):
The Gouda cheese the most.
Speaker 1 (24:23):
Did you see that, doctor McEnery that Google advertised their
AI product with a Super Bowl ad that had incorrect
information in it. The premise was, this is a cheese
farmer and he is a whiz when it comes to
making good cheese, but he's all thumbs when it comes
to writing marketing material. And so he it showed Google
(24:44):
AI like telling him facts that he could put on
his cheese. And one of the facts was that Gouda
cheese is the most popular type of cheese and six.
Speaker 2 (24:56):
The world's cheese sales.
Speaker 1 (24:58):
What which is just like on its surface, like obviously wrong,
and they still like put it in a in a
Super Bowl commercial, Like somebody caught it once. The ad
had like been up and so it didn't make it
to the super Bowl, but it made it like online
and was viewed millions of times. And it's just that
seems to be like it's if if it's not going
(25:20):
to be one hundred percent, if it's not going to
be right one hundred percent of the time, it's kind
of useless because it's just like that. I mean, yeah,
I feel like everybody should be like would be opting
out of it if they knew, like and just so
you know, like one out of like every I don't
know twenty of the things that you search is going
(25:40):
to be like blatantly wrong in a way that is
going to be humiliating to you. Yeah, Yang, we're not
going to tell you which one enjoy the product.
Speaker 4 (25:50):
Yeah. I mean, first, I love the idea that it's
not even the ara but it's just big gudhas out
there trying to spread some misinformation about the popularity of
Guda cheese. But second, like, yeah, I feel like one
way of people describe it as this idea of like yeah,
something like chat GPT or the AI overview can be
useful for getting like a sense of a topic. But yeah,
I guess the issue for me, is like that often
requires quite a lot of expertise to be able to
(26:12):
know whether something is right or not. And so for example,
like my husband as from North Carolina, as a basketball fan,
like I am sort of forcibly inducted into the NBA
enough that yeah, I could probably tell that eighty percent
maybe actually that might be overestimating my knowledge, but yeah,
that twenty percent would be totally out of my knowledge.
And so I think that's my fears. If you're relying
(26:32):
on these tools. It's not that people can't tell that
something's wrong, but you know that when we're using it
for like a really wide range of applications, that does
actually require a lot of expertise in all those different things.
And I think, certainly, speaking for myself, like that's not
something I would be able to do or to discern.
Speaker 2 (26:49):
Yeah, because I mean sure, like you think of it like, well,
if I get a B on my test, but if
you're asking something to like explain the Civil War and
you get all the generals and the battles and dates right,
but the that the reason for the American Civil War wrong,
where it's like and they all fought over you know,
economic rights, and then that's now it doesn't matter that
(27:09):
eighty percent is completely negated by this other piece of
information that has completely colored the you know, the description
of something. Now, That's why I'm like, even every time
I see those summaries, I'm like, no, Like I'll click
on the links that are saying like we're using these
to tell us, and I'll look at them like, this
is not really even exactly what's happening here, to the
point that I feel like it's causing more harm than
(27:31):
good because I've at least learned to try and just
research myself. You know. That's how I got my theories
on the Earth shape and things like that.
Speaker 1 (27:41):
My research some interesting ideas on that. I don't know
if you have a couple hours, Doctor mcinnae.
Speaker 2 (27:46):
Well, doctor, I've every time I've emailed doctor Carry, she's
respectfully declined to entertain the conversation, which I under said.
She has a luck going on. She has luck going on,
but I.
Speaker 1 (27:53):
Mean it is she could be a useful source to
you though you're looking for somebody who's been in an airplane,
you know.
Speaker 2 (27:57):
Yeah, I looked down.
Speaker 1 (28:00):
Are there cool usees of AI that aren't getting attention?
Or I guess even like tech breakthroughs that you know,
we asked this the last time you were on But
like we're still in the early stages of AI. Are
there directions that have like popped out to you that
you know the future of technology and this technology in
(28:21):
particular could take that are promising for like the bettering
of the world for more than like twelve rich guys,
which I feel like is the current the current model.
It's like these twelve guys are like, Yeah, this would
be amazing if we could replace all the people.
Speaker 2 (28:37):
See me with total row. I was with total Rope
from the Miyazaki movie.
Speaker 1 (28:42):
But what where are you seeing hope?
Speaker 2 (28:46):
Where are you seeing hope?
Speaker 4 (28:47):
I guess I'm genuinely terrified and like say something you
would be like, that's the exact same amount so you
said two years ago and destroy Yeah, I mean I
think like I'm always quite excited more creative uses of
AI or people who are like really trying to think
about like instead of saying like how can we make
like one product that works for the whole world, like
(29:08):
which tends to be the approach of things like chatt
GBT and then like spor they obviously don't work for
the whole world and all these different cultural contexts and
all these different linguistic contexts, because I don't think a
single product can, but you know, I do think like
there are really interesting examples of groups like Tahuku Media
in New Zealand where I'm from, which have been like
using AI and machine learning teachiks to try and focus
(29:29):
on trail Mody or the indigenous language of New Zealand's
language preservation. And so because of like long histories of
kind of the state suppression of Terreo Modi, there was
like a period where like there weren't that many Native
Trail Moordi speakers. It was like really aggressively suppressed. And
now as a result, people are kind of trying to
really reinvest and support the kind of revitalization of Treo Mody.
(29:51):
And Peter Lucas Jones, who's this CEO, I believe Teku
Media and like their whole team have been really intentional
and sort of really world leading think and how they've
been trying to use AI machine learning for this. I
think a big part of their project though, is that
they're very insistent on like indigenous data sovereignty or making
sure that like their platforms aren't sold to big tech
(30:12):
or reliant on big tech, And I think I imagine
that's like a really challenging project because like this sort
of small handful of like big tech firms like are
incredibly dominant in this space, but a lot of that
has been around like no, you know, we really want
to make sure that this remains like technology by and
for MALDI people and for our organization. And so I
think projects like that I find like really really inspiring
(30:34):
and really important. But I think they're also just like
a great example of like AI development being done super well,
which is like you have a clear problem, and you
have clear ideas and ways that you think that AI
machine learning can help us address in part some of
this problem without like positioning it as the solution, because
I think if anyone comes out of the gate ands
like AI is going to solve this problem, that's when
I think you should always be a bit like, oh,
(30:56):
I don't think it is, especially if the problem is
something like really really massive, right discrimination is kind of like, well, look,
we just have to reject that one sort of straight
out the gate and kind of think a bit more
specifically about this.
Speaker 2 (31:08):
And I'm sure those companies are like and when we
made that claim, that wasn't actually meant for you to
be the receiver of that message. It was for Wall
Street and investors when we said this thing will solve everything,
because like, yeah, I mean, like that application feels like
the kind of thing that like, you know, in the
US right now, Trump is very focused on eliminating any
semblance of equity or diversity inclusion. Obviously, as we've seen
(31:32):
like the woke DEI initiative crusade against all those things,
and that sounds like exactly the kind of thing that
Trump would be, like, that's not useful. It just has
to be this other thing that's a money making endeavor.
Because right now, his people, like the people within the administration,
like the head of Science and Technology, have said things
like Biden's AI policies were like divisive and it's all
(31:55):
been about all been about the redistribution, like redistribution in
the name of equity. And naturally Trump has fired many
of the AI experts Biden hired because it was clear
like obviously Biden hired them. He's like, there's a huge
bias problem with any of this stuff, and if it's
even going to be a product people use, like that's
probably a thing worth tackling. But now like it's become
(32:17):
sort of like normalized within this administration to say this
is all harmful now because it's trying to advance equity,
when like when we've spoken to people like you and
other experts, it's like, no, you have to get rid
of those inequities or else it doesn't even fucking work.
Like like when you're talking about things like being able
to someone who has like a darker complexion, how does
(32:38):
a like a self automated car even identify that pedestrian
as an object to avoid? Because again, these biases affect
all these different systems. But it feels like now like
at least from the American conservative side of things or
just generally the tech conservative movement is like, yeah, maybe
it's just like fine if it you know, misidentifies like
(33:00):
black people or doing these other things that just kind
of show that at the end of the day, we
would only it only needs to work in the way
that we want it to work, and all the other
applications whatever be damned.
Speaker 1 (33:10):
Keeps identifying black people as traffic cones. Do we think
that we need Can we go to market with this
still or is that are we good here?
Speaker 2 (33:19):
Well, that's yeah, And I'm just I'm amazed at how
you know, I think, how much like objectively, this is
a thing. If you want a product to work, it
has to be able to be used around the world.
So how useful is that in a place that doesn't
have like an ethnic majority that's all white people in
the same way like that? You know, again, these all
(33:41):
just feel very counterintuitive. But that seems to be the
name of this year, this year's theme generally.
Speaker 4 (33:47):
Yeah, yeah, I mean the idea of this is the
year of counterintuitive thing. This really resonates. Yeah, and I
think it's like not only disappointing, because it's like I
do think that it shouldn't be super hard to buy
into the idea that like, yeah, AI that is like
more equitable, less biased, and more fear like genuinely is
actually in the long run good for everyone. Although I
know there's like a reactionary group that feels like any
(34:07):
kind of equity or equality is you know, kind of
impinging on their own kind of share of the pie.
But you know, I really think like AI ethics and
safety is for everyone. But yeah, I also think it's
very sad because like this has huge knock on effis
for the rest of the world, because like the US
is a world leader in AI and tech production, and
so yeah, if you have an environment that is kind
of saying let's throw ethics out the window, then that
(34:30):
does have knock on effis for the rest of the
world their buias and uses still a lot of this technology.
So yeah, I think like these rollbacks obviously massively affect
the US domestic context, but they certainly don't stop.
Speaker 2 (34:41):
There, right, Is there any do you see any put
like this is as stupid on its phase as it
sounds right to be, like we have to we have
to stop with these inclusive efforts to address biases within
AI like models. There's no like it's as dumb as
that sounds, right, because in my mind and everything I've read,
people are like, no, no, no, like it it works
(35:03):
worse when it has all these like inbuilt biases, like
it will not work as good therefore is not viable.
So it is that is bad, right, There's no like
secret things like well, you know, some bias is good
for these things.
Speaker 4 (35:18):
I mean, if someone if that is that is the
secret source, and please tell me. Definitely not what I
would think. I mean, yeah, I mean I think it
just comes down to again, like I think there's a
fundamental irony of saying, like, you know, the power of
these AI enabled tools and products is that you know,
we can perform all these like massive tasks at scale,
and this idea of again like a product that can
be sold across the world, a product that can be
(35:39):
used at scale with the kind of knowledge that this
only really works for like a very very narrow base.
I mean my assumption to be fair though, is like
I think that a lot of people who make products
that have, you know, these kinds of exclusions or biases
aren't necessarily going in being like, I know my product
is really biased, and I actually just like don't care,
Like I don't think that usually is it? I think
often to me, it's just this kind of mindset of
(36:01):
either we just like haven't really thought about it. I
think this particularly common with accessibility, which is that of
the accessibility has to be integrated and from the very
beginning of the design process it can't be slapped on
at the end, and too often I think that's how
it gets approached, and so people just haven't even begun
to think about you know, oh, like, will my language,
I mean, sorry, will my model work for disabled people?
(36:22):
My product work for people with these different kinds of
like physical disabilities. Like, you know, I think it's just
that the whole groups of populations just get ignored or
you know, maybe they've realized and they think, oh, it's
actually a really bad thing that this product doesn't work
for this particular group. But I think I'm just going
to make the trade off and decide, like, I think
that's a small enough consumer base that I can still
sell my product, and like, I don't think I'll get
(36:43):
too much into pushback. I think it'll be fine. And
I think this, you know, might often be the case
for populations that are perceived as being like very very small.
So I'm thinking of say, like trans or gender diverse
people who often get erased from certain data sets because
they're like, oh, well, this is like statistically a very
small percentage. It's like, yeah, but those people's exclusion like
really really matters. It has a huge impact on their
(37:05):
life if they go through a scanner and their body
is not recognized or seen as non normative, or if
they're excluded from different databases like these do have huge
knock on effects for people. So yeah, so I guess
I would say that, you know, the kinds of people
maybe who are not seeing AI ethics as a priority
aren't doing it because they're just like you know, oh,
you know, debiasing whatever. That's fine. I think it's Yeah,
(37:28):
it's probably more just to do with a lot of
different blinkers, like a certain kind of narrow mindset about
who your consumer is and who's like actually using these products.
Speaker 2 (37:36):
Right, Yeah, it does feel probably, But for the Trump administration,
I'd say they're probably very much focused on the fact
that because there it doesn't matter. It's like, I don't know,
even if it makes everything unsafe, I just have to
say the words I don't like equity, and that's without
any consideration for what that means.
Speaker 1 (37:51):
And if the whole world breaks, like the better for
him to consolidate power, you know, Like that yeah feels yeah.
I mean, did you see this Semaphore article about like
the group, the Mark Andresen group chats that like, so
he's been having like these signal chats since the days
of the pandemic, and it's like he has like gone
(38:13):
out of his way to bring in all of these
tech leaders and then like fucking super far right wing,
like people like Richard Hannania, like the guy who's like
an outright white supremacist, and like put these people in
group chats together. Like at one point he like brought
(38:34):
in some progressive people and then they wrote a New
York Times op ed criticizing laws that were banning critical
race theory, and he just like had a meltdown and
was like, you betrayed me by writing this thing, and
like kicked them out of the group chat, and like
since then, it's just all this like extreme right wing
(38:57):
propaganda that's being like kind of vomited back and forth
between these people who are like the biggest, most powerful
oligarchs and like the people who are in charge of
the direction that tech takes. And so I yeah, I
mean it feels like this whole thing is developing in
a way that feels particularly like non optimal and like
(39:20):
stupid and narrow minded and racist and white supremacist and
all those things, and like that this was very helpful
for a context. I just wonder like all of like
we're seeing a model that's being developed not in the
best way possible. It seems like like to say the least,
(39:42):
and we're probably going to like discount a lot of
these possibilities because they're so shitty at what they're doing.
But do you see that sort of white supremacist mindset
kind of pervading in the tech mainstream?
Speaker 2 (40:00):
What a question?
Speaker 4 (40:02):
Yeah? No, I mean I feel like I guess, like
what I do think this giesses towards. Is this like
very public repositioning of Silicon Valley, which I think always
had this like relatively liberal veneer, even if it's not
clear how deep those roots actually went kind of very
explicitly aligning themselves with the right and with Trump. And
it's a little bit hard to tell, like how much
of this is like political expediency people saying, well, clearly,
(40:25):
you know, I'm going to do anything to avoid heavier regulation,
we do anything to avoid this kind of like sort
of punitive regime that Trump's exacting on many different institutions,
So we're going to you know, put ourselves in his camp.
And how much of this is like an expression of
like deeper ideologies and deeper kind of political beliefs that
have maybe sat door mentor sort of have kind of
(40:46):
been cultivated within Silicon Valley and tech firms and now
kind of finally bursting onto the scene now that they
feel a bit more empowered, as there's kind of been
this like global shift towards the right. So yeah, I
don't know, honestly, like how much, which you know, we
can discern between sort of like the real deep feeling
and the kind of political expediency argument. But I think
(41:07):
it's just undeniable that you have people like particularly Mark Zuckerberg,
who would have been like normally more kind of center
possibly center left a lot of issues even though he
was obviously running like a massively exploitative, gigantic, quite dangerous
tech company now sort of really aggressively rebranding as a
kind of wooing Trump, but also sort of big into
(41:29):
a lot of these like hyper masculiness, tech bro sort
of languages and ideologies that I don't think certainly five
years ago we would have seen him sort of supporting
in the same way. So yeah, it feels like there's
been this sort of very visible shift in Silicon Valley.
But again, as someone who's not based in Silicon Valley,
like I probably couldn't tell you like how much that
feels like this this is actually the real face of
(41:50):
the Beast versus this is actually just like what people
are doing for the moment.
Speaker 2 (41:54):
Yeah, because I mean you see how much like Andresen
got Silicon Valley money together to get Trump into office,
along with a bunch of other crypto people's like seventy
eight million dollars from like the and recent side of things,
and you know, like these group chats they do, it's
like our modern day smoke filled lounges where you get
to see these very powerful people sort of debate these
(42:17):
topics and get their takes out there that are like
wacky as hell. And that's why I think, like reading
this article, it was a little bit more like damn.
I mean, I don't know if everybody in this group
chet thinks that, but there are definitely some vocal people
in this group chat that definitely think they are the
ones who are going to solve these very complex issues,
(42:37):
but in the most like inelegant, one size fits all
kind of way. That's just all about power. And that's
when I'm like, oh, I think it's really these are
starting to blend together, especially as we see how much
how people's like media diets and the information they receive
are informed by what these people who are running these companies,
(42:58):
how they believe in like how information should move and
how people get siloed into information bubbles and things like that.
That's when I started to begin to be like, mmm,
this feels a little bit more like a smoking gun
when you hear like, you know, these ideas are being
like exchanged and knowing that like there's this guy Curtis
Yarvin that we talked about a few weeks ago who's
like basically like a tech monarchist who has a lot
(43:20):
of ideas that people like Elon Musk and Peter Teel
like are really subscribed to. And yeah, and like in
these group chats. It's is where these a lot of
these very influential people start getting introduced to these kinds
of ideas. So that's when I'm like, hmm, this feels
very sinister at best. I mean, like when you see
(43:41):
this and then thinking that these are the people that
control the levers that you know, just normal people who
are using the Internet and stuff get affected by their policies,
it feels like a little bit like they figured out
this is how they exercise like their immense control over
people in these sort of you know, less sort of
not like in the ways that we think in terms
of governance or whatever, but through the spread of information
(44:03):
and ideas.
Speaker 4 (44:04):
Yeah, and I guess I feel like the signal chats
like raise two things about this kind of like pivot
to the right, which I think are both really important,
and like, I think the one is what you've pointed out,
is this like small scale influencing. Like I think often
when we talk about disinformation or misinformation, we're thinking about
that in terms of like, oh, someone makes like an
AI deep think and they like release it onto the internet,
and somehow that deep fake like convinces like many many
(44:26):
many people that this event occurred or didn't occur, and
it has like seismic effects, and like, you know, again,
I don't think that's necessarily how disinformation works. I think that, yeah,
these kinds of like small cultivated circles of trust and
maybe the things that we should be really really concerned about,
which is like what actually causes people to change their
mind or change behaviors, Like maybe these are the levers
that can like be significantly more effective. But yet the
(44:49):
second is kind of a point you were raising, and
I think does again like tie into what happens when
you build these bonds of trust, which is just like
following the money. And like you mentioned sort of the
massive amount of tech funding that got t I'm at
the office and I had a colleague who now wiks
to The Guardian, and he and the team kind of
just like tracked all the funding behind the Trump and
Harris campaigns, and what was just really astounding was just
(45:11):
like the extraordinary amount of money that specifically just came
from tech, and like that's been a really big change
in the last five years or so. Is just like
how much money sort of Silicon Valley has deally putting
it into political lobbying. And so I think, you know, yeah,
it's even regardless of like your political orientation, but like
particularly right now with the Trump administration, Like I think
(45:32):
that's something we should all be really concerned about.
Speaker 2 (45:34):
Mm hmm, yeah. I mean, because it feels like this
is the best way. In some level, the tech industry
is sort of like, well, we're actually now the ones
that are really able to manufacture consent through social media
and misinformation, and you marry that with like a you know,
presidential administration that would really benefit from that kind of
like full court press propaganda. Kind of making and it
(45:58):
feels like, oh, everyone kind of wins in their own way,
although there are clearly some people intec have their regrets
after all the tariff chaos, and they're like no, no, no, no, no,
no no no not actually, also, don't just fuck that
one's actually really valuable. I just wanted less regulations and
for people to say the N word on Twitter more.
That was my whole thing, and now I have no
(46:20):
money or less money. But yeah, it's it feels like
it's just like the most clown show version of all this,
I think, which is also very frustrating to watch or
just have to sit idly buyas it's all happening to
us very intentional.
Speaker 1 (46:35):
Very Also, it's just like really pathetic, just like seeing
how easy, easily influenced these people are. I mean, it
makes sense because it feels like they're from a church
that believes that whoever, wherever, the most money is equals
the right thing, and so you just like get them
on a group chat with a billionaire and they're like, well,
(46:57):
I mean that guy must be the smartest guy in
the world, and so right his ideas I have to
listen to.
Speaker 4 (47:03):
Yeah, and I do think as well, like, you know,
to some extent, I'm like, yeah, to that point, much
less interested in what people like quote unquote really think
and much more interesting in just what they do. And
so to some extent, I'm like, yeah, you know, does
it really matter whether Mark Zuckerberg is personally invested in
these like you know, quite misogynist ideologies about like what
it means to be a man? Or doesn't matter that
(47:25):
he like has gone on Joe Rogan and now like
propagates a lot of these ideas and has you know,
publicly shift admitta to kind of align with or support
Trump's administration. Like that, to me is like what really
matters right now, which is that we have kind of
these active, kind of shifts of power in the tech
industry that go beyond kind of like politically signaling in
certain ways, Like they're really really tangible, And I think
(47:45):
Musk is like the flagship of that in terms of
someone who's just been like very very visible in the administration.
But you know, he's certainly not the only one.
Speaker 1 (47:53):
Yeah, it feels like there's no shortage of these tech
billionaires who are dead set on a horrifying idea.
Speaker 2 (48:00):
All right, let's take a quick break and we'll come
back and keep talking about AI. Will be right back,
and we're back. We're back.
Speaker 1 (48:19):
And one of the trends that we've seen is the
police and other forces trying to use fake AI personas
who are there to get college students to admit that
they haven't like signed the National Pledge to support Israel
or whatever, and you know, or try and radicalize people
(48:43):
or you know, all the all the entrapment and shit
that we were seeing like as early as like the
Oughts with like nine to eleven and the quote unquote
war on Terror. But first of all, I'm just curious,
Like it seems like the sort of thing that would
be hard to get fooled by. But the Internet is
obviously massive and full of like stupid people. But like,
(49:06):
does this sort of application concern you? Are the models,
doctor McNerney, like advanced to a degree that like, you know,
I could get pulled in by a bot that like
claims to be a college professor who once has some
(49:26):
interesting literature for.
Speaker 2 (49:27):
Me to read.
Speaker 4 (49:29):
Yeah, I mean I guess for me, I feel like
this is how most of these scams work. Though, right
as you always think you're never going to get taken
in by one until you get taken in by one,
And I think they really, you know, they benefit from
that kind of sense of confidence, but also just like
more broadly from the fact that like we have to
have a degree of trust now interactions with people to
like function societally, and like that's what's so damaging about
(49:51):
these like fraudulent or fake uses of AI. It's like
they really affect that kind of shared trust.
Speaker 2 (49:56):
Yeah, I get so many.
Speaker 1 (49:58):
I think I'm like confident off of all the text
messages I get. They're like, Hi, do you want to
get lunch? And it's like a new number. I'm just like, well,
I know this person's not real. So once again ninety
nine and oh when it comes to telling when a
bot is coming for me, but then I did get
into a long term relationship with a bot, so but no,
(50:20):
uh well, yeah, are there are there like strategies for
telling whether you're chatting with it? Like is there a
turing test? Like not obviously like specific questions like in
a Blade Runner, but like overall things that they don't
deal well with.
Speaker 2 (50:37):
Or that like a bit of misinformation here when you're
a young kid and you're like you don't when a cop,
you got to ask them if they're a cop, and
they got to tell you right now right or else
it's a trapman, Like you you're not an AI, right,
I mean, is it that easy? Is there a silver bullet?
Speaker 4 (50:52):
That's a good question, because yeah, I mean I do
think like part of the interesting thing I think about
a lot of scams with this is that my colleague,
the philosopher clear Bent, says, she's like, often they're not
intending to try and make their contact the most believable
with you, like when you get these like mass spam emails,
Like I think this happens to most of us. As
we look at something, we're like, that's obviously spam. And
(51:14):
my colleague clear says, it's like, well, it's because like
they don't want to be getting a huge pool of
people who are like not actually that easily food getting
bought in to the first step, because like these people
are not going to go all the way like they
want the people who they think will go all the
way with them and hand over money or like who
will like even not feel confident enough to challenge what's happening,
or they'll be really like bored into what's happening. But
(51:36):
you know, yeah, like the kind of core of like
the scamming is not necessarily saying like oh, it's about
crafting the perfect email to get people through the door.
It's about like, no, who can we actually take to
the end. And I think like, yeah, a lot of
these AI chatbots, it's like we might look at this
and say like, Okay, well this seems really sketchy, but
like they just need to get enough people who are
like oh okay. But actually, like a lot of the
interfaces I use online are like not super great or like,
(51:59):
you know, I might not find them very intuitive, and like, oh,
this is kind of a scary thing that I've been
suggested to do, or like they've made a threat about
like my money or something like that. So I've got
to now like stay engaged in this conversation. And you know,
so I think like to some extent, like there is
like a particular exploitation of vulnerability that happens in these
in these scams. But yeah, in terms of like a
(52:20):
Turing test for these chatbots, I don't know, because
I feel like, also, yeah, like I feel like people
are often like, oh, like I can totally tell when
I'm chatting to a chatbot, and I'm like, I don't
know if I always can. I think, you know, it is,
they can be, like especially with something like ChatGPT.
Like, you know, I know people have anecdotally
talked about, say, trying to discern whether like ChatGPT
(52:42):
has been used to write essays, like that's more common
in the context that I'm in, higher education. And like,
sometimes there can be like really clear, like really clear keywords,
or they'll say, okay, like the people keep using a
phrase that is like not super normal to use in
this discipline, but like probably is quite normal if you
use AI. So they might have keywords like that, you
might have really obvious tells, like they've kept the prompt in,
(53:04):
like okay, please write me an essay about it.
Speaker 3 (53:07):
So that's kind of, okay, here is your essay
on the following prompt, right. But.
Speaker 4 (53:12):
Yeah, I mean I think to some extent, like you know,
I do worry a little bit if people are like, no,
I completely believe that I am like scam proof when
it comes to AI, because I'm like I do think,
you know, compared to like five years ago, like we
do just have like much more sophisticated voice cloning technologies,
language models and so on and so forth, that might make
it a little bit harder to discern in that kind
(53:34):
of like initial first pass. But this is just like
my instinctive thoughts, and like maybe this actually is, yeah, like
the version of, like, a cop has to tell you
they're a cop, for AI.
Speaker 2 (53:41):
Yeah. I mean, if you're asking my opinion, Jack, I
think it's great that the cops are using AI to
entrap people online. I think I'm not concerned. I
think, I was just looking out for you guys. Yeah,
as long as whatever makes the cops' jobs easier, you know,
I'm all for that. I back the blue, but that
is too much overtime pay. No, I mean, but that
(54:04):
is interesting, Doctor Kerry, to consider. Like, I didn't think
about how it's like you need it to be shit,
so you put off the people that are going to
be smart enough by the time it comes to the
part where you ask for money. Because if they're already in
at the most wacky version, like, Carol, when are we
going to play golf this weekend? And they're like, huh,
I'm not Carol... and they're like, exactly. That's funny.
Speaker 1 (54:28):
I got the most interesting investment opportunity right right right
right from Carol's golf buddy.
Speaker 2 (54:33):
Who's Carol? I'm not sure. I don't know, I don't know.
Speaker 1 (54:36):
Yeah, yeah, I mean I feel like the images are
the concerning place. Like, just the degree to which
AI has made Photoshop really, really easy. Like, there's just,
I don't know, it's degraded it. Like
there was just a story over the weekend where a
lot of people were fooled by AI created mug shots
(55:00):
of the judge that the Trump administration arrested, and like
making it look like she's like weeping as she's being arrested.
But like, I just feel like there's just no, no
way to believe any image that you see. Like you
just need to keep googling, asking whether it is
(55:20):
AI or not. And then of course you're asking Google's
AI whether it's AI or not, and you know they're
probably in on it too. But what would you say?
What are the uses that you feel like are most
concerning, that you're kind of keeping an eye on,
that you see evolving the fastest?
Speaker 4 (55:37):
I mean, I think what you've actually pointed out to
me is the thing that like I get the most
worried about personally, which is less the production of like
individual pieces of media, like video deep fakes or photographs,
even though those can be really really scary and quite
concerning when you see them made. Like for me, it's
like that broader point you said of like I don't
really know what I can trust, and like I think
the creation of deep fakes in and of themselves, like
(55:59):
you know, they undermine sort of our broader trust in
our information environment. And like, this is happening at a
time when our information environment is already very chaotic and
quite hostile, and so like, yeah, I really feel in particular,
I mean I feel for all of us, but I
really feel for young people kind of growing up in
this like very intense, polarized, kind of extreme media environment.
(56:20):
And I don't think that kind of the possibility of
deep fakes and the sort of, this idea that like
anything created could be fictitious, is going to necessarily help
that or help people believe that they can have access
to kind of verifiable and true information. So like to
give an example of this, like one example here is
this idea of like the liar's dividend, which is like
anyone could say something kind of mean or offensive or untrue,
(56:43):
and then you know, when confronted about it, they could say, like, oh,
that video clip's not true, like that was a deep fake,
Like that's fake news, to use a very common term
from US politics, And you know, it could be that actually,
like ninety-nine percent of people are like, no, we
really think you said that. That sounds like you, like
we've done all these checks and we think this is
real footage. But like, that can't just be snuffed out
(57:05):
in people's minds. You can plant that one percent where
people are like, but what if actually this is
a deep fake? Like, what if this isn't true? And
I think that's how you kind of get this
erosion of sort of the information landscape that makes everything
that little bit shakier. Because yeah, again, like I think
when it comes to say, like the impact of like
AI generated misinformation sort of on things like elections, Like
(57:29):
there was a lot of fear around this last year
because last year kind of had all these democratic
elections, it was called like the Year of Elections, and also sort
of the emergence of like really, really sophisticated generative AI
tools coming out at the same time. And it seems
like actually the verdict on that is a little bit mixed.
And so you know, one kind of study looked at
this and they said, well, actually, of, you know, just
(57:50):
under eighty pieces of media we looked at of these
AI generated sort of pieces attached to different political campaigns
or elections, and like, you know, half of them like
weren't necessarily misinformation. And then the other half of them, like,
you know, yes, a lot of them were made
by AI, but like a lot of them really were
like more like cheap fakes or like they were not
necessarily actually deeply reliant on AI tools. It could have
(58:13):
quite easily been made with something like Photoshop, for example.
So yeah, AI has made the production of these particular
pieces of media easier, but it's not necessarily actually like fundamentally
changed what we're able to do in these particular cases.
So yeah, so I think like when it comes to
sort of like the cause and effect of saying, like, you know,
we're going to have these specific pieces of disinformation, they're
(58:35):
gonna have these specific negative effects, I think, you know,
the picture there is maybe not as extreme as some
of the media fear around it last year was portraying,
But yeah, I think this broader question of like how
do we know what we can trust like that remains
really central.
Speaker 2 (58:49):
Yeah, I feel like there's just also outside of obviously
like the geopolitical things that are very front of mind,
because those are, you know, the things that have an effect
on a lot of people. I just see it creeping
into so many creative spaces that I'm like, it's cheapening
so many things because like, for example, like Pinterest, if
you look for anything around like architecture right now, I
(59:12):
was, like, looking for something, because
I'm in the process of hopefully like rebuilding my home,
like trying to look at architecture pictures, and eighty percent
of them are bullshit AI slop images where you're like, oh,
that's an interesting kitchen. You're like, why is the sink
in the wall like this? Like, wait, hold on, what?
And I'm like, then none of these power cords like
(59:33):
line up with where the lamps are. And it just
sort of, like, gets to the point where, like,
this isn't even real stuff that I can
draw inspiration from or even like on Spotify, like there's
so much more AI-generated music that is also kind
of, like, shifting people's focus. Like, I also see
the way we listen to music is really changing, where
(59:54):
it's much more passive and you're being fed music rather
than like seeking music. You used to, you know, you
used to go buy a CD. Which I don't mind,
because I love when I discover new music through
these kinds of, sort of, algorithmic playlists and
things like that. But it's also created like a whole
other like genre of music that's like a fake artist
(01:00:16):
that is just putting up sort of textural,
like really uninteresting music to have in the background.
There's some really insidious, like, word for it. It's
like functional music, or just something that's, like,
not really meant to be listened to, but just put on passively.
And when I see all that, I'm like, oh, this
really also takes away from, like, the magic
of human creativity and, you know, just the skills
(01:00:39):
that people build through composition or design or whatever, you know,
because everything is being flooded with things that are just
kind of like based on it, but not you know,
having that real sort of sincerity to it. And that's
another one that I see happening more and more
and more, and like, ooh, this feels like this is
going to be a slippery slope too. No, I think
it'll be fine anyways. Yeah, yeah, yeah, it's fine, it's fine. Yeah,
(01:01:04):
It's all A1. It's all A1, as Linda...
Did you see her
Speaker 4 (01:01:07):
Say that? I unfortunately am aware of this.
Speaker 2 (01:01:11):
Yeah, Linda McMahon, the Secretary of Education, repeatedly will describe AI
as A1. Wow. Yeah, it's like, as if we
weren't just bad enough, bad, bad all around, even if
she's ready. Yeah, yeah, it was like, I mean, with
the rise of A1, and you're like, are you
not even around people who are saying it out loud?
(01:01:34):
Or do you love people with the worst fucking taste
in steak sauce?
Speaker 1 (01:01:37):
Right?
Speaker 4 (01:01:38):
But that's, that's why I worry, though, about, you know,
the numbness I mentioned at the beginning, where like when
I heard that, I was just like, yeah, that sounds,
that sounds like what I would expect at this point,
someone out there saying A1. I'm like, no, I
should not, like, in any way have this normalized.
But like I feel like we've hit that stage where
it just seems to be this like total disregard for
(01:01:59):
any kind of like strategic policy planning or expertise, and
instead you get the kind of A1s. Yeah, well,
there.
Speaker 2 (01:02:07):
Are my A1s, my day ones.
Speaker 1 (01:02:10):
Well, Doctor McInerney, it has been a pleasure having you back
on The Daily Zeitgeist. Where can people find you, follow you,
hear you, all that good stuff?
Speaker 4 (01:02:18):
Yeah, well, if you're interested in more conversations around feminism and technology,
then I highly recommend you check out my podcast with Doctor
Eleanor Drage, The Good Robot Podcast. And yeah, if you're
interested in kind of some of the topics
we've been discussing today, a couple of other Substacks and
channels maybe to shout out: AI Snake Oil is
a really, really great one if you're interested in kind of
(01:02:39):
maybe trying to debunk some of the hype around AI,
trying to, like, get to grips with, like, okay, what
can all these different AI tools do or not do?
I think they do a really great job. And I
think Mystery AI Hype Theater 3000, I always get
the name mixed up, but like, that's also good
fun, and they have a lot of fun, I think,
kind of trying to look through various sort of, you know,
(01:03:01):
slightly sketchy-looking AI projects together and sort of, you know,
tearing them apart. So would definitely recommend those shows too. Nice.
Speaker 2 (01:03:08):
Is there a work of media or social media that
you've been enjoying?
Speaker 4 (01:03:13):
Oh gosh, I mean, I'm not proud to say this,
but I'm actually like writing a book chapter at the
moment on, like, internet scam culture. And so as part
of that, I've been going back to, you know, the
great debacle of Fyre Festival. And obviously I'm like, I
think this is indicative of like much bigger societal problems.
But the memes that did come out of Fyre Festival
were unfortunately spectacular, and so I have been going back
(01:03:36):
through some of those, like the now very mature,
like, five-year-old cheese sandwich memes that came
out of Fyre Festival. So that's been my, like, current
social media. Why are we doing that, like, old albums?
Speaker 2 (01:03:49):
Like you're like, man, I remember the memes when Trump
got COVID. Remember the memes when the Queen died? Like,
just, like, you're like, man, the way I was getting them?
Speaker 4 (01:03:57):
Yeah, like they are, you just, like... So like,
here it was like when Liz Truss, uh, was kind
of, you know, really going through it, like that was just
like relentless political memes. Uh, Rishi Sunak, when he
called the election. But yeah, you just kind of get
swept away. So you know, maybe that's the new thing,
the kind of meme album.
Speaker 2 (01:04:16):
I'm ready. I'm ready for a meme album. Miles, where
can people find you? Is there a work of media you've been
enjoying? Oh, me? You can find me everywhere they have at symbols,
at Miles of Gray. If you want to hear me
sob this week about the state of the Lakers, that's
on Miles and Jack Got Mad Boosties, our NBA podcast.
(01:04:36):
I mean, whatever, whatever, whatever, whatever, you know, it's just fine.
You know, we're just gonna have to come back from
three to one, you know what I mean. He's done
it before, we've done it before, and that's I guess
that's what it calls for. That's what it calls for.
Or I will just ignore everything and go full fetal
and sob every night. But there's that, And then if
you want to hear me talk about ninety Day Fiancé,
I do that over at four twenty Day Fiancé. A
post I like from Blue Sky is from Eric Columbus
(01:04:58):
at Eric Columbus dot bsky dot social. He quote tweeted
a tweet from a reporter. And, first of all,
the tweet that's been quoted, because the
Canadian elections had just happened, said one couple at
a voting station in Port Credit said they would rather
not speak to American media. They then apologized three times.
And then he quote tweeted it, saying, peak Canada. Yeah, yep,
(01:05:22):
I get it, I get it. Don't speak to... don't,
but I'm also sorry, we won't, we can't speak to
American media.
Speaker 1 (01:05:28):
Tweet I've been enjoying, from Huey Lewis and the
News, at Treye Dessert, who tweeted, instant deleted tweet Hall of
Fame nominee, and then retweeted a New England Patriots tweet.
So they drafted somebody named Kobe Minor with the very
last pick in the NFL draft, and they tweeted an
(01:05:50):
image of him with the words A Minor, because his
last name's Minor. Amazing. Oh, Kobe. Anyways, you can find
me on Twitter at Jack underscore O'Brien, on Blue Sky
at Jack O B the Number One. You can find
us on Twitter and Blue Sky at Daily Zeitgeist. We're
(01:06:11):
at the Daily Zeitgeist on Instagram. You can go to
the description of the episode wherever you're listening to this
and you can find the footnotes, which is where we
link off to the information that we talked about in
today's episode.
Speaker 2 (01:06:22):
We also link off to.
Speaker 1 (01:06:23):
A song that we think you might enjoy. Hey, Miles,
is there a song that you think people might enjoy?
Speaker 2 (01:06:29):
Yeah, this is a group called Reo Kosta, K-O-S-T-A.
And this is a track called Ancients. And so
if you like jungle, if you like, uh, you know,
that, that sort of, like, Tame Impala feel. This feels
like kind of a mix between, like, sort of psychedelic-y
but funky, with a lot of falsetto vocals.
(01:06:50):
I think it's great. Very... look, you know, this is great,
great music. It's, it's time to let our big toe
shoot up in our boot. Okay. So this is Ancients
by the group Reo Kosta. Check it out.
Speaker 1 (01:07:00):
We link off to that in the footnotes. The Daily
Zeitgeist is a production of iHeartRadio. For more
podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or
wherever you listen to your favorite shows.
Speaker 2 (01:07:09):
That's gonna do it for us this morning.
Speaker 1 (01:07:12):
We are back this afternoon to tell you what is trending,
and we will talk to you all then. Bye-bye.
Speaker 2 (01:07:18):
The Daily Zeitgeist is executive produced by Catherine Law.
Speaker 4 (01:07:21):
Co-produced by Bee Wang.
Speaker 2 (01:07:23):
Co-produced by Victor Wright, edited and engineered by Justin Connor.