Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:07):
From Kaleidoscope and iHeart Podcasts, this is Tech Stuff.
I'm Oz Woloshyn and I'm Cara Price. Today we've got
two big stories to break down for you. First, could
China unseat the US in more than just AI and
(00:28):
catch up in pharmaceuticals too? Then, inmates' phone calls with
loved ones are being used to train AI that then
monitors their behavior.
Speaker 2 (00:38):
Then we'll tell you about a few other stories that
caught our eye this week, like how the UK police
want to cross reference CCTV footage with government databases, and
how the youngest female self made billionaire made all her money. Finally,
we discuss giant blueberries and why the company selling
them is valued at a billion dollars.
Speaker 3 (00:58):
Then on chat to me, there was concern, you know,
when two nuclear armed states are on the brink that way,
that this could escalate and explode not just regionally but internationally.
Speaker 1 (01:08):
All of that on the weekend Tech. It's Friday, December twelfth.
Speaker 2 (01:14):
Hello Cara. Hi Oz. Do you follow Art Basel?
Speaker 1 (01:18):
I've never been. Miami, I know. But I'm fascinated from afar.
It's like the most hype.
Speaker 2 (01:24):
It is the hype beast event of any time in
human history.
Speaker 1 (01:29):
Last time it broke the internet was with the like
twenty million dollar banana. That was last year or two
years ago.
Speaker 2 (01:35):
Probably. Was it, for you, offensive? It was very offensive. So
let me ask you a question. Have you seen the
robot dogs? I have.
Speaker 1 (01:42):
Okay, robot dog?
Speaker 2 (01:43):
Would you like to describe them a little bit? What
you saw?
Speaker 1 (01:45):
Okay? So you remember the Boston Dynamics dogs. Yes, they
were these sort of uncannily dog-like robots that kind
of pranced a little bit, like, what are those,
greyhounds? Like little greyhounds? So imagine robot baby greyhounds
with very realistic masks that depict tech billionaires.
Speaker 2 (02:07):
You described it perfectly.
Speaker 1 (02:08):
Well together in a kind of pen.
Speaker 2 (02:10):
That's exactly right. So it's an odd group of people-dogs
in this pen. It was Jeff Bezos, Mark Zuckerberg,
Elon Musk, Pablo Picasso, Andy Warhol, and also a dog
of the artist Beeple, who created the dogs. The masks
that you were talking about are made by a special
effects artist named Landon Meier, and they are these
hyperrealistic faces, exactly like human faces.
Speaker 1 (02:34):
So I saw the images, but I didn't really know
what is this all actually about.
Speaker 2 (02:38):
So basically Beeple wanted to make the point that it
used to be that we saw the world interpreted through
the eyes of artists, but now Mark Zuckerberg and Elon
in particular control a huge amount of how we see
the world. So he wanted to show that basically they
are the cultural conduits of our time.
Speaker 1 (02:57):
And what do they do apart from run around the pen?
Speaker 2 (03:00):
So this is what I love. So they run around
the pen, but when you watch them, you can kind
of see them, what's the word that I'm looking for,
shitting out pictures that are in the style of the
artists that they are depicting or the tech bro So,
for example, the Andy Warhol robot dog will produce an
image that looks like a silk screen, and Picasso's is cubism,
(03:23):
Elon's is black and white, and Zuck's, to quote Beeple,
looks like the metaverse.
Speaker 1 (03:28):
Wow. So are the dogs taking photos of stuff
they're seeing in the audience and then turning them into works...
Speaker 2 (03:35):
Of art? Works of art? That's correct, but the actual
dogs themselves are selling for one hundred thousand dollars each.
Imagine just having someone in your house and they're being
a little Elon musk dog running.
Speaker 1 (03:46):
Around taking photos of your guests and then pooping them
out. Crazy. I gather, though, that the Bezos dog is
not for sale and doesn't poop. That's because he's king
of the tech bros.
Speaker 2 (03:57):
That's because he's king of the tech bros. I just
thought it was an interesting story to start with because
I think that Beeple is making an interesting point.
Speaker 1 (04:04):
It's a tech bros' world. We just live in it.
Speaker 2 (04:07):
This is me high fiving them.
Speaker 3 (04:09):
Yeah.
Speaker 1 (04:09):
And also, I mean, look, so Beeple really shot to
fame during the height of the NFT craze. That's right.
He sold a single NFT for, I think, sixty-nine million dollars.
Speaker 2 (04:19):
Which, speaking of, is something that Bezos could shit out, that
amount of money.
Speaker 1 (04:23):
And I believe some of the images that the dogs
poop out are NFTs that you can also buy. And Beeple,
as you mentioned earlier, is also one of the dogs.
So I think he's poking fun at himself and his
oeuvre a little bit. Absolutely, right? And I
mean, you could say this is absolutely grotesque and...
Speaker 2 (04:42):
People are paying money for it.
Speaker 1 (04:43):
On the other hand, the guy is a master at capturing
people's attention, gaming the hype cycle, and forcing people to talk.
When was the last time you saw an article in
Page Six questioning how the tech oligarchs make us see
the world through their filter?
Speaker 2 (04:57):
He did it, He did it, He did it.
Speaker 1 (04:58):
Good on you, Beeple. Yeah, I'd like to move on
now from big tech to big pharma, and specifically tech
investments in and around pharmaceuticals. A while back, I read
this piece in the FT which asked the question, why
is AI struggling to discover new drugs?
Speaker 2 (05:14):
Because why is AI struggling to find new drugs?
Speaker 1 (05:17):
Well, that's a good question, actually. Let me tell you. I didn't really
realize this until I read the FT piece. Obviously, I'm fully
aware of all the investment in AI drug discovery now,
but it turned out there was kind of a first
wave in the, like, twenty thirteen to twenty seventeen period
where hundreds of millions of dollars were invested. And the
kind of period when you would normally expect a drug
(05:38):
to come to market is about ten years.
Speaker 2 (05:41):
So if my math is right, we should be seeing
like a whole bunch of new drugs come onto the
market now.
Speaker 1 (05:47):
That's right, but we're not, and according to the FT,
there's a few reasons for that. One is money. Essentially,
these companies couldn't bring their product to market quickly enough
to keep raising money to keep their companies alive, and
investors lost patience. I mean, to be fair to these companies,
drug discovery and the trials to actually bring drugs to
market are incredibly expensive, which is part of the reason
(06:09):
why these big pharma companies have such a lock on healthcare,
because they're the only people who can afford these cycles.
Speaker 2 (06:15):
Right, right, right, So it was more of a funding issue. Ultimately,
I think it.
Speaker 1 (06:18):
Was a funding issue and also a technology issue. I mean,
don't forget Back in the day, the AI tools were
nowhere near where they are today, So you basically have
to choose one problem and develop a specific AI to
tackle it, whereas now you have these broad AIS that
can tackle multiple problems simultaneously. So basically, according to the Ft,
(06:38):
the decade-long clock kind of got reset in the
aftermath of the ChatGPT moment, twenty twenty-three, twenty
twenty-four. So what happens next? Well, we'll have to
wait and see whether this great promise of AI to
deliver new drugs takes place in the next five to
seven years. One veteran chemist interviewed in the piece said
that drug discovery is, quote, probably the hardest thing mankind
(07:02):
tries to do. Another person, who's actually a founder
of one of these new generation drug companies, said, quote, I
used to say we were the industry with the highest
failure rate of anything but space exploration. And then
she goes on to say, space exploration started to work.
Speaker 2 (07:19):
Wow. Touché. Yeah, that's very interesting.
Speaker 1 (07:21):
I read another piece in the FT, also about drug discovery. I
love the FT, and I guess I love drugs too. Sorry.
The question this piece asked is, will the next blockbuster
drug come from China?
Speaker 2 (07:31):
Well, that's really interesting. So there's almost like a parallel
arms race in pharma and AI with China.
Speaker 1 (07:37):
And the intersection. And you can kind of see American
pharma companies beginning to make very substantial investments in China.
For example, Novo Nordisk, the company that developed Ozempic,
paid a Chinese company up to two billion dollars to
license a next-gen weight loss drug. So yeah, it's
kind of that parallel thing where China
(07:57):
was the factory for US innovation for so long and
now is starting to really develop and refine and make
better products, or at least equally good products, faster and cheaper.
Speaker 2 (08:07):
So how are they doing this?
Speaker 1 (08:09):
Well, I mean, it's kind of that thing where the
government says this is a priority, and also we'll just
cut through all of the red tape required, we'll make
trials easier, we'll make recruiting for trials easier, we'll approve
the construction of new facilities in days rather than months
or years. They basically said, this is a government priority
to become a leader in pharmaceuticals, and therefore we will
remove absolutely every obstacle, whether it's financial, whether it's regulatory,
(08:33):
to bringing drugs to market fast. And it looks like it's
starting to work.
Speaker 2 (08:36):
So what does this mean for the US?
Speaker 1 (08:38):
Exactly? Well, these Chinese drug makers are now developing drugs
two or three times faster than most other countries. They're
still a long long way off. Like the twenty largest
pharma companies in the world don't include a single Chinese one.
The top ten are very much dominated by the US.
But you know, it's another area where strategic competition is
(08:58):
starting to make itself known. And crucially, one in four
generic drugs consumed in the US, like Tylenol, has
ingredients that come from China or is manufactured in China. So
you can see this real point of leverage brewing, both
with access to generic drugs, but also whether China can overtake the
US as the drug innovator of record. It's kind of another
(09:20):
area where the US ability to create the most valuable
IP in the world is under threat.
Speaker 2 (09:26):
So Oz, I want to bring you a story that
really pissed me off, which is not something that I
share all the.
Speaker 1 (09:32):
Time year that's time I heard you say something that yeah, sure,
well you.
Speaker 2 (09:35):
Know, my therapist says, I'm not quick to express anger.
But this is this was actually a story I read
and I felt like viscerally angry about it. So it's
from the MIT Technology Review and it's about how private
prisons are now training AI on prisoner phone calls.
Speaker 1 (09:51):
What does that mean?
Speaker 2 (09:52):
You know, when you make a phone call from prison,
there are two companies that essentially allow you to do that.
Speaker 1 (09:57):
And they charge you, like, a lot, right?
Speaker 2 (10:00):
So it's a huge amount of money.
Speaker 1 (10:01):
That's right, for the privilege, or some might even say the
human right, of being able to talk to your family.
Speaker 2 (10:05):
That's exactly right, do you know how big this business is?
Speaker 1 (10:10):
I know there's a despicable number of people incarcerated
in the US.
Speaker 2 (10:14):
The inmate calling system is a one point two billion
dollar business.
Speaker 1 (10:18):
Just the calling system, that's correct.
Speaker 2 (10:20):
The story focuses on a company called Securus Technologies. They're
one of two companies that specialize in the prison phone system.
The thing that really caught my attention in this article
is that Securus has been investing in voice recognition products
since twenty nineteen and AI products since twenty twenty-three,
which means they've been training their own LLMs on these
(10:42):
phone calls.
Speaker 1 (10:43):
So they own basically the whole batch of recordings
and can use it for whatever experiments they want to run.
Speaker 2 (10:48):
I mean, the experiment that they apparently are running is
building large language models that are going to help law
enforcement track and basically predict if crimes are going to happen
on the basis of what people are talking about in
their prison phone calls. It's surveillance on a level like
you know, you think about panoptic power, and this is
(11:12):
like the panopticon having another reason to be a panopticon. Yeah,
so we know that LLMs are being piloted in certain markets,
but we don't know which markets. And actually, the Securus
president Kevin Elder told the MIT Technology Review that Securus's
monitoring efforts have helped disrupt human trafficking and gang activities
(11:32):
organized from within prisons, among other crimes, and said its
tools are also used to identify prison staff who are
bringing in contraband. What
Speaker 1 (11:41):
do prison advocates say? I mean, there was a podcast... They
hate it.
Speaker 2 (11:46):
They hate it. A woman named Bianca Tylek from the
organization Worth Rises, I think, said it really the best,
which is she said, there's literally no other way you
can communicate with your family. Not only are you not
compensating them for the use of their data, but you're
actually charging them to collect their data.
Speaker 1 (12:02):
Yeah, I mean it does kind of stick in your
craw a bit, doesn't it.
Speaker 2 (12:08):
It sticks in my craw, you could say. Yeah, I
mean what infuriates me is that if inmates want to
communicate with the outside world, they have to play within
this system. Like they don't have a choice of how
they're making phone calls, and so while they're informed that
their messages are being recorded, they aren't informed about how
this data is used at all.
Speaker 1 (12:27):
Now, do you think the reason this story strikes us
both is because we are such sort of empathetic people,
or is it because in some sense it's a grotesquely
magnified pastiche of the experience of being a citizen in
the twenty first century.
Speaker 2 (12:47):
I do think that yes, this is a magnified version
of a surveillance capitalist state that we all live in,
that we experience on a daily basis giving away data
for free.
Speaker 1 (13:00):
I guess the difference is that we technically theoretically can
use DuckDuckGo instead of Google, or not. Right?
Speaker 2 (13:06):
We have tools that help us, like a VPN.
Speaker 1 (13:09):
We have a choice which most of us don't exercise.
Speaker 2 (13:12):
Choice within our non choice exactly.
Speaker 1 (13:14):
Whereas this is like, you know, well, your choice is
not to communicate, I guess. Yeah.
Speaker 2 (13:18):
And it's just I understand I suppose I understand the
need to stop human trafficking, for example, or I have
the understanding that it's important to stop you know, drug trafficking,
or you know, illegal contraband coming into a prison.
This just seems like the easiest, most crass way to
(13:42):
possibly surveil people.
Speaker 1 (13:44):
I also think, you know, it's such an easy thing
to productize and say, like, put these super predators in
prison, listen in on their calls, and then this
infallible AI tool will tell you who's the baddest
of the bad. And it's just like, really? It's
Speaker 2 (13:59):
Also like, is this where we need to be applying
AI tools? Like, that's... I just feel like, money?
Speaker 1 (14:04):
What about rehabilitation?
Speaker 2 (14:05):
Rehabilitation programs, precisely. So, while I don't think twice about
accepting cookies or giving away my data, I do at
least have the choice to care without completely cutting myself
off from the outside world.
Speaker 1 (14:17):
Yeah, I think you make a good point, which
is, like, you can support the goals of law enforcement,
like we don't want human trafficking, all these things you've mentioned,
and still have questions about this particular technique. I
mean to me, the kind of framing question which we're
going to come back to after the break in the
story I'm going to tell you is philosophically, would you
rather live in a world with no privacy or in
(14:40):
a world with no crime?
Speaker 2 (14:41):
That's a good question.
Speaker 1 (14:42):
We're going to come back to it, and we're also
going to talk about the world's youngest female self made
billionaire and why people are getting addicted to giant fruit
and we're back. Do you know the phrase one nation
(15:02):
under CCTV?
Speaker 2 (15:03):
I know the phrase one nation under God.
Speaker 1 (15:05):
That's one nation under CCTV. It was another contemporary artist,
who was the Beeple of the
Speaker 2 (15:11):
pre-NFT era. Banksy. Banksy, yes.
Speaker 1 (15:14):
This phrase was graffitied by Banksy outside a London post
office in two thousand and eight. According to a conservative newspaper
in Britain, the Telegraph, the UK could now be on a path
to becoming one nation under AI-enabled CCTV, thanks to
a new proposal from the Labour government.
Speaker 2 (15:34):
Say more about this.
Speaker 1 (15:36):
So, according to the Telegraph, which is not a fan of
the Labour government, let's just put our cards
on the table. Yeah, Labour is proposing that police be
allowed to compare photos of crime suspects from CCTV, doorbells,
and dash cams against facial images on government databases, including
the passports of forty-five million Britons.
Speaker 2 (15:58):
What yeah, So why are they proposing this?
Speaker 1 (16:02):
Well, it comes back to the story we were talking
about before the break, right, Like they're proposing this because
they say that having CCTV cameras everywhere in Britain which
are hooked into the mainframe of the passport photo database
will allow it to be possible to stop crime in
real time, to take sexual predators off the streets before
(16:23):
they offend again, etcetera, etcetera, etcetera. But some are saying,
I think, frankly, including me, hold on a second, when
I wanted to get my passport, I didn't consent to
having my face be available for a live panoptic prison system.
Speaker 2 (16:38):
Well, very similarly to what we were saying is that
when prisoners consent to make phone calls that are being recorded,
I can't imagine that they are thinking these recordings will
be used to train LLMs.
Speaker 1 (16:48):
And there's all these problems of bias in these facial
recognition systems, which we've started talking about, which we've talked
about until we're blue in the face, but which haven't
really been addressed, or properly addressed, which is, if you're
a person of color, you're much more likely to be misidentified.
But again there's this like technological logic where because it's
tech enabled, people believe it's true. And it brings me
(17:08):
back to that question about like, yes, in theory, like
I would like people who are dangerous criminals to be
very quickly taken off the streets before they offend. But
do I want the police to have unfettered access to
the whole passport database? I mean, in a just society
where you have confidence in the government, a democratic government
(17:31):
whose values you share, maybe this is okay. Maybe, But
what happens when this falls into the wrong hands.
Speaker 2 (17:39):
Right exactly exactly?
Speaker 1 (17:41):
You know, there's just there's so many ways to turn
this into a system for extreme evil.
Speaker 2 (17:47):
I think, and we saw it happen with the Uyghurs, with
the Uyghur minority in China. Like, it can happen very
easily. It's not like it hasn't happened already. There was
an entire system that the Chinese government had, called IJOP, which
essentially collects data from tons of sources that
are being used by average Chinese citizens. So it's
(18:07):
not crazy to imagine.
Speaker 1 (18:09):
No, I mean, I think it's, you know, the irony
of this whole story is ten years ago we were saying,
oh my god, China has built this surveillance state and
facial recognition of our lives. Isn't that awful? It's just
an ironic moment where our political systems are under so
much strain that we have this aspiration to things that
filled us with horror until recently.
Speaker 2 (18:28):
Yeah. So I have a question for you. How many
female billionaires can you name?
Speaker 1 (18:35):
Is Nancy Pelosi a billionaire? You know there's a fund you
can use to track Nancy Pelosi's trades and make them yourself.
Speaker 2 (18:44):
Really, Oh, I have never seen that on the train.
Speaker 1 (18:47):
Okay, but but Nancy Pelosi did not.
Speaker 2 (18:49):
Nancy is definitely not one. You're trolling. So this week
I was trying, like I was trying to think, like,
who's a female billionaire? Who's a female billionaire? And I
was thinking about it because the world has a new
billionaire and she just happens to be the youngest female
self made billionaire to date.
Speaker 1 (19:06):
Wow.
Speaker 2 (19:07):
And for the sake of our show, she works in technology.
Speaker 1 (19:10):
Wow. I'd love to have her on. Who is she?
Speaker 2 (19:11):
She told me we should we should have her on.
Actually she's really interesting. So last week, you know, we
talked about Polymarket, which is the online prediction market that
continues to kind of disgust me. But the woman who
is now a billionaire co-founded Polymarket's competitor, which,
as you know, is Kalshi. That's
Speaker 1 (19:27):
Correct, which sounds like a breakfast here like used to
eat called kashi.
Speaker 2 (19:30):
But I remember Kashi made me so ill. You would
eat Kashi? I love that cereal. So Kalshi is similar
to Polymarket in that people can make bets on
everything from, you know, sports, to elections, to the weather.
But unlike Polymarket, which was crypto based, Kalshi is
based in the US dollar and received approval from the
CFTC, the Commodity Futures Trading Commission, in twenty twenty. Poly
(19:55):
market received the approval in September twenty twenty-five, but
only after being fined one point four million
dollars for operating in unregistered markets.
Speaker 1 (20:04):
You sounded like Dr. Evil in Austin Powers: one point four million dollars. Interesting.
So Kalshi recently caught my eye because they announced
a partnership with CNN. That's right, where CNN is basically
gonna broadcast Kalshi's live predictions on things which are
going to happen, which some people within CNN were pretty
(20:24):
upset about, because, you know, the idea of
journalists becoming sort of jockeys and, like, you know,
odds-makers, is...
Speaker 2 (20:32):
What is that called, insider trading? That's potentially what it's called.
Speaker 1 (20:35):
It sounds like it could be. I want to know
more about this female founder. Who is she? What's her name?
Speaker 2 (20:39):
Her name is Luana Lopes Lara and her co-founder
is named Tarek Mansour. She's twenty-nine, and Luana's backstory
is like something out of a Russian spy novel. She
was trained as a professional ballerina in Brazil, extreme discipline,
and even turned pro in Austria for nine months, but
then she was like, I'm out of here, I don't
want to be here anymore, and went to study computer
(21:01):
science at MIT, and that's where she met Tarek. Wow.
Speaker 1 (21:05):
And they founded this company, Kalshi together.
Speaker 2 (21:08):
They did. I liked this from a piece that I
read about it, that he basically noticed that she was
sitting up front in class and was like, that's the
girl who's going to be a billionaire. Yeah, exactly. But
according to their website, both Luana and Tarek worked at
financial institutions like Goldman Sachs and observed that many financial
decisions were driven by predictions about future events. And then
(21:28):
they thought it was odd that there wasn't a straightforward
way to trade directly on event outcomes, so they set
out to create a place for this type of direct exchange.
Speaker 1 (21:39):
That's interesting. You mentioned, I think half joking, that Polymarket
disgusted you. But we talked about this last week in
respect to people betting on like you know, the outcome
of Russia Ukraine battles over individual.
Speaker 2 (21:49):
towns, and Zelensky's suit.
Speaker 1 (21:51):
But I said to you then, which I will repeat now, like, yes,
that is in very poor taste. But to what
you just told me, it democratizes the types of bets
that governments and banks in a sense already make and
just makes them very visible and consumer facing.
Speaker 2 (22:05):
That's right. There is a little bit of a like
let's take on the man energy to this whole thing.
I just think the story is really interesting because when
you think of tech billionaire, what do you think, like,
what's the model? Right? Man and sometimes boy for me, yeah,
but you don't think of woman or girl, not to
be so binary. But no, I think I agree with you,
you know, And so I think this idea of like
(22:27):
socialized betting on the basis of like maybe what a
government is going to do, even though it's still mostly
sports focused, is really important for us to keep talking
about it.
Speaker 1 (22:38):
And also to your point about Kalshi and Polymarket, I mean,
these are two of the most valuable private companies, or
very extremely valuable private companies, that are not AI companies.
They're both valued, I think, at over ten billion dollars. Yes.
You want to know another unicorn that's been prancing around
my subconscious? It's a company called Fruitist, who make
giant fruit.
Speaker 2 (22:59):
It's so funny. There was a I don't know if
you ever saw this, but it's there's these kind of
bougie bodegas in New York City and there was a strawberry.
Oh yeah, I was astonished at the cost of this strawberry.
Speaker 1 (23:12):
It was a strawberry, I think nineteen dollars, something like that,
for a single strawberry.
Speaker 2 (23:16):
And what that tells me is there's a market for it.
But whatever, I'm sorry to interrupt you, Well, you want
to talk about.
Speaker 1 (23:21):
The founder of Fruitist actually directly pushes back on what you've
just said and said what I'm doing has nothing to
do with champagne strawberries. But let me read you the
headline from Fortune. Ray Dadio is backing a one billion
dollar blueberry unicorn that sells berries nearly the size of
golf balls. They're this huge. I want to live in
Speaker 2 (23:40):
The fact that we can report on big fruit in
the same breath as companies literally using large language models
on the basis of prisoner phone calls is where technology
is truly at its craziest, and where we
Speaker 1 (23:52):
Have the poorest taste of anyone in the world.
Speaker 2 (23:54):
Possibly.
Speaker 1 (23:55):
So when I first saw this, I was like, oh my god,
they made GM fruit. Wrong. These are actually normal fruit, but
super optimized by data. They've bought farms around the world.
They've used AI to basically totally optimize the production, distribution, storage, etc.
of the blueberries. Founder Steve Magami has said, quote, they
(24:18):
actually pop when you bite into them.
Speaker 2 (24:21):
So we have gotten to a point in our culture
where we literally can't stomach having any uncertainty around a
blueberry like we need we need to actually use data
to make sure that our blueberry is perfect exactly.
Speaker 1 (24:35):
That's exactly it. And it's not just blueberries. There are raspberries
and blackberries in development, and Fruitist has struck a cherry
deal in China.
Speaker 2 (24:44):
I don't even know what that means.
Speaker 1 (24:46):
To your point though, about consistency, that is exactly the goal.
CEO Magami refers to ending the problem of, quote, berry roulette,
meaning no more inconsistency in berries. And we can laugh,
but Fruitist's sales have tripled in the last year,
surpassing four hundred million dollars. The company
was considering going public, but with tariffs they decided to
(25:06):
pause that.
Speaker 2 (25:07):
Can I just ask a question, where do you buy
these fruits, like everywhere, like Whole Foods? But is the
brand called Fruitist, or are they...
Speaker 1 (25:14):
It says Fruitist, but actually, it's not
like the nineteen-dollar strawberry. It comes in regular packaging.
I see. So you can get like a pack of
like golf ball sized blueberries which is like eight dollars
for ten, let's say, which is still expensive, but it's
different from nineteen dollars for one. You know.
Speaker 2 (25:33):
I think one of the interesting things about this piece
to me, and people have been writing a lot about
this vis-à-vis restaurants and menus, is that, like, GLP-1s
and MAHA culture have really influenced what people demand
in terms of the products that they eat or buy.
Speaker 1 (25:50):
One hundred percent. And this is, you know, you're singing
from CEO Magami's hymn book here, because he said
these are not berries to go on your muffins or
your oatmeal. These are standalone, snackable berries. All
other snack categories are seriously down since the launch of
Ozempic, and berries, I think, are up. So yeah, I like
these stories that have all the different Yeah, all the
(26:11):
different things we talk about come together in a golf
ball size blueberry that pops when you bite into it.
So this week for chatting me, I'm bringing us a
story that I heard at a conference last week called
(26:32):
the Doha Forum. And Yalda Hakim, who is the lead
world news presenter for Sky News in Britain, was there
and she was moderating a panel. But she opened it
talking about a very strange experience of going viral for
all the wrong reasons.
Speaker 3 (26:48):
So this interview was shared over a million times. I
was seeing the interview itself just going completely viral within
twelve hours. Then my producer sent me a message saying
that this is the fake version of the interview.
Speaker 2 (27:02):
What do you mean? I thought she recorded an interview.
Speaker 1 (27:04):
She did, but the interview, not just the person's responses,
but her questions were both deep faked.
Speaker 3 (27:11):
What terrified me is the fact that these deep fakes
have become better and smarter. For the last seven or
ten years, we've been hearing about deep fakes, and what
they could do to society and you know the impact
it's going to have. But I suddenly saw something that, frankly,
if you didn't know my voice, if you didn't know me,
and if you didn't know my mannerisms, you would think
(27:33):
that that clip was.
Speaker 1 (27:34):
Real and it wasn't any old interview that was being
deep faked. Yalda's original conversation was with the sister of
the former Prime Minister of Pakistan, Imran Khan, and he's
been in jail since twenty twenty three on charges of
corruption that his supporters say are all politically motivated, and
the conversation was about that and how he's being treated
in jail. The way it was manipulated, however, was much
(27:55):
more provocative, much more potentially dangerous, and that's what went.
Speaker 3 (27:58):
Viral. The questions that I had asked her had been completely changed, fabricated, manipulated.
The focus was aggression towards India and an attack on
the army chief Asim Munir of Pakistan. And I think what
was terrifying is the fact that we know that India
and Pakistan went essentially to war in May. They went
(28:21):
head to head following a terrorist attack in Kashmir. There
was concern, you know, when two nuclear armed states are
on the brink that way, that this could escalate and explode,
not just regionally, but internationally. And this was further fueling
the flame to the point where the Defense Minister of
Pakistan responded to the interview the fake version as though
(28:41):
it was real, and mainstream media in India picked it
up and it made headlines across the country.
Speaker 2 (28:47):
This is what experts have been warning us about for years,
that deep fakes can be used to actually escalate tensions
and break down trust between people and their leaders. I mean,
you think about we have a US president who reposts
deep fakes all the time.
Speaker 1 (29:01):
Well, that's right, and we've become accustomed, I think, because
of our president, to viewing deep fakes as kind of
memes on steroids or comedy. Yeah, and almost a little
bit desensitized to the kind of threat that people have
been warning of for years, that they could be used
to manipulate public opinion and cause political catastrophe. And that's
why I wanted to get Yalda on the show to
(29:22):
talk about this, because it seemed like this deep fake
really was taken seriously by major politicians in two countries
that are nuclear armed and have been earlier this year
on the brink of war.
Speaker 2 (29:36):
So how did she actually rein in the deep fake interview?
Speaker 1 (29:39):
Well, it's a bit of a to-do, but she
was able to use her platform to indeed rein it in.
Speaker 3 (29:44):
The way that I dealt with it was first of all,
putting a post up on my social media saying that
this deep fake is fake and it's AI generated. That
also got picked up. I also did some interviews on
sky News platforms. They ran the original and the fake version,
and I talked around it and clarified. But I think
what is terrifying is the fact that this really does
(30:08):
test our democracy, journalism, our work, and how we're going
to have to deal with things in the future.
Speaker 1 (30:14):
This is quite a dramatic story of the intersection of
humanity and AI and not something that most of us
encounter in our everyday lives. But we do want to
hear about your everyday lives and your interactions with technology.
How are you using chatbots? How are you interacting with synthetic media?
Anything you can share with us about how your life
is being changed in real time by your interaction with
(30:36):
new technology is what we want to feature on this show.
So Please write to us at tech Stuff podcast at
gmail dot com.
Speaker 4 (30:42):
We'll feature your stories and we'll send you a free
T-shirt.
Speaker 2 (30:55):
That's it for this week for tech Stuff.
Speaker 1 (30:57):
I'm Cara Price and I'm Oz Woloshyn. This episode was produced
by Eliza Dennis and Melissa Slaughter. It was executive produced
by me, Cara Price, Julian Nutter, and Kate Osborne for
Kaleidoscope and Katrina Norvell for iHeart Podcasts. The engineer is
Behid Fraser and Jack Insley mixed this episode. Kyle Murdoch
wrote our theme song.
Speaker 2 (31:15):
Please rate, review and reach out to us at tech
Stuff podcast at gmail dot com. We want to hear
from you.