
December 30, 2024 • 12 mins
"Suchir Balaji help to gather and organize the enormous amounts of internet data used to train the startup's ChatGPT chatbot. " - New York Times

On this podcast we review the latest article as it relates to current events following the suspicious death of Suchir Balaji.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Hey, what's going on you guys?

Speaker 2 (00:02):
This is Joseph Bonner and you are listening to the
New York Times Review.

Speaker 1 (00:09):
Welcome to the show. Today, we're gonna actually be reviewing
a New.

Speaker 2 (00:13):
York Times article that was published back on October twenty third,
twenty twenty four, by Cade Metz. It was
an article that had to do with OpenAI. The
title of the article is "Former OpenAI Researcher Says
the Company Broke Copyright Law."

Speaker 1 (00:33):
The article has an interview.

Speaker 2 (00:36):
With Suchir Balaji, who is, well, they call
him the whistleblower. But I'm gonna be honest with
you, as I read the article, I don't really
consider him a whistleblower. And I'll explain more
about that as we go through the article. But Suchir
used to work at OpenAI, and he recently was

(01:00):
found dead in his.

Speaker 1 (01:04):
Living unit.

Speaker 2 (01:05):
Now police and investigators are saying that he died.

Speaker 1 (01:09):
Of a suicide.

Speaker 2 (01:11):
His parents, after looking at the photos from the scene
of the alleged suicide, are.

Speaker 1 (01:16):
Not convinced that it was a suicide.

Speaker 2 (01:20):
Because they said that he was not
suicidal and there was no note left.

Speaker 1 (01:25):
So the mom was like, it doesn't seem like it's
a suicide to me.

Speaker 2 (01:30):
So I thought it was very important to kind of
go back over the article that landed Suchir in
the media spotlight.

Speaker 1 (01:38):
Initially, which was the article published by The.

Speaker 2 (01:40):
New York Times back on October twenty third, the article that
we're now going to be reviewing now, the article.

Speaker 1 (01:48):
Now published by The New York Times.

Speaker 2 (01:50):
You guys should just know this going into this article:
the New York Times believes that OpenAI is
violating copyright laws, and they actually have a lawsuit against
OpenAI. So Suchir reached out to them,
and that's what happened. He actually reached out to
The New York Times, no doubt after finding out that
they were in, you know, litigation with OpenAI and

(02:12):
deciding to leave the company. He wanted to kind of
give them his take on the situation, which is what
he mentioned on social media. So he wanted to kind
of give them his perspective on.

Speaker 1 (02:23):
OpenAI.

Speaker 2 (02:25):
The article does go into, you know, what the laws
are in regards to, you know, fair copyright use,
and that the New York Times was suing OpenAI.
That was back... it was published in October, but
they were going to be suing them this month in December. No,
it's the end of December, so they probably already filed
the lawsuit. And again, the accusation is that OpenAI

(02:51):
is in violation of copyright laws, whereas OpenAI says, no,
we still fall under the
fair use law, because we're not in direct competition with
anybody who wrote the work. You know, we're just training our AI
to pick up, you know, wording, phrases, typologies,

(03:15):
and therefore we are presenting a completely different product than
what the original work was, and we're not competing with uh,
you know, the.

Speaker 1 (03:26):
Original writer or composer of the work.

Speaker 2 (03:29):
Well, Suchir did not agree with that. He actually
believes that there was competition with creators when
their work is being used to train OpenAI, which
is why he feels that his.

Speaker 1 (03:43):
Company was in violation of copyright laws.

Speaker 2 (03:46):
So reading over this article, I'm gonna
be honest, it's a very simple.

Speaker 1 (03:51):
Article, you guys.

Speaker 2 (03:52):
It doesn't really get into too much of the law
of, you know, fair copyright use, as well as,
you know, copyright laws.

Speaker 1 (04:00):
It's not.

Speaker 2 (04:02):
A legally analytical article. So it's really opinion based.
And keeping in mind, too, that Mr. Balaji,
you know, he was, you know, one of the
persons who worked on the app.

Speaker 1 (04:18):
He's not a law student.

Speaker 2 (04:20):
All he gave in the article was just his interpretation
of how he felt they were violating laws.

Speaker 1 (04:27):
I mean, so it's a very simple article.

Speaker 2 (04:31):
It doesn't really push one ideology over another, and I.

Speaker 1 (04:38):
Just find it very... The reason why I wanted.

Speaker 2 (04:40):
To do a review of this article is because, like,
to me, this is nothing to murder
somebody over, if that makes any sense. Like, I couldn't...
It just wouldn't make any sense
to me if somebody who was associated with
OpenAI read this article and was like, hey, we have
to get this whistleblower.

Speaker 1 (05:01):
It's it's his opinions are not that strong.

Speaker 2 (05:04):
I mean, his points were not that solid to
where somebody would want to take him out, at least
in my opinion. You know, we live in a very
crazy world though. And I will say this:
the article does mention, you know, in regards,

Speaker 1 (05:22):
At least the.

Speaker 2 (05:24):
Stance that OpenAI takes in regards to, you know,
their use of copyrighted material to train their AI. They
say they're not in direct competition. And I would say
this: when they were a nonprofit, probably not. I
think their argument was solid when they were a nonprofit,

(05:48):
But now that they are no longer a nonprofit, I
do believe that Mr. Balaji does have a point at
this point. Now he does, because now it's
a for-profit organization that is no longer a nonprofit.

Speaker 1 (06:01):
And so although they were founded on the grounds of.

Speaker 2 (06:06):
You know, creating an environment or a product that was
useful for everybody, free to use for everybody, and, you
know, profit wasn't the objective of the
company when they first started. That has since changed. So
I'm not sure that the argument that they used, or

(06:29):
that they used to start the company stands today in
twenty twenty four. So maybe that could be a reason,
therefore, to want to take somebody out.

Speaker 1 (06:37):
I don't know.

Speaker 2 (06:38):
I just personally didn't feel that Mr. Balaji's arguments were
so strong and so concrete against OpenAI that
they would want to take him out. The only reason
why I would see a company wanting to take someone out...
I'm not saying that they did, obviously I don't know,
but I'm just trying to understand. Because,
you know, the parents say his death looks suspicious, and

(07:00):
then at the same time, you know, the investigators say
it was a suicide. But I'm gonna be honest with you,
it does not seem like it was a suicide.

Speaker 1 (07:08):
It does not seem like it was a suicide at all.

Speaker 2 (07:10):
From his voice, especially from his recent posts on social media,
he had no plans on going anywhere for a very
long time.

Speaker 1 (07:18):
I can tell.

Speaker 2 (07:18):
You can tell when people write and when they speak,
especially when they make moves like reaching out to the
media, in this particular instance. Those are not people
who plan on going anywhere anytime soon. So the
whole suicide bit, that is very sketchy. It's just
surprising to me that he would even be considered a

(07:40):
threat to the company. Like, really? So again,
like I'm saying, I'm not saying that OpenAI
put a hit out on him.

Speaker 1 (07:48):
You know, I'm not saying that. But if somebody from
the company did.

Speaker 2 (07:54):
it would be like, really? It wouldn't make any sense, because
his arguments weren't that strong. And if you're doing something
illegal or fraudulent as a company, chances are somebody else
is going to bring it to light.

Speaker 1 (08:06):
Anyways.

Speaker 2 (08:07):
It's like, things always come to light eventually.

Speaker 1 (08:10):
You know, they always come to light.

Speaker 2 (08:13):
Look at the case of Diddy, right? Things
will eventually come to light, even if they take ten, twenty years.

Speaker 1 (08:18):
Eventually people are going to find out. That's just how
it works.

Speaker 2 (08:22):
I did think the article was okay written. I
would say not well written, only because, you know, obviously
the New York Times has a dog in this fight, and
you can tell from the writing of
the article that they do have a dog in the fight.
And I like New York Times articles when there is
no dog in the fight and they can report on
it from just a journalistic perspective. You don't get that

(08:43):
luxury here with this particular article, because they were going
to litigate with the company in December, when this article.

Speaker 1 (08:49):
Was written in October.

Speaker 2 (08:50):
So unfortunately, you don't feel the New York Times quality
that you would normally feel with other articles, just because,
you know, there's always going to be a question of
how genuine are you when.

Speaker 1 (09:02):
You publish this, because you have a dog in the fight.

Speaker 2 (09:04):
It's just, it's almost like a conflict of
interest for journalists to publish on things that have to
do directly with their company, just because it automatically will
call their journalistic integrity into question. Whether it should be questioned or not,
that's beside the point. The fact
is, it will be questioned, because they have a dog
in the fight. I will say the article, though, was
written okay. There weren't any inaccuracies in it.

(09:26):
There weren't any dogma-pushing tactics in it, from what
I could see. So that's why I say that the
writing was okay. I just felt really bad for Mr.
Balaji, that he's no longer alive.

Speaker 1 (09:38):
I do not think he killed himself.

Speaker 2 (09:40):
I do not believe it was a suicide. And the
fact that authorities are saying that it was is why
there are red flags.

Speaker 1 (09:46):
Now.

Speaker 2 (09:46):
You know, if they would have just said he died, it
would have been like, oh, okay, he passed away. If they
would have said that

Speaker 1 (09:50):
We don't know how it happened,

Speaker 2 (09:52):
I would be like, okay. But because they said it
was a suicide, you know that something's not right. It
does look like the company that he once was associated

Speaker 1 (10:04):
With him did something. It looks that way.

Speaker 2 (10:07):
The surrounding circumstances make it look like OpenAI took him out.

Speaker 1 (10:13):
That's just how it looks.

Speaker 2 (10:15):
Uh, and they should know that. They should know that's
how it looks. And I hope that it gets investigated thoroughly.
Obviously, the FBI and the President of the United
States have been called on to investigate this.

Speaker 1 (10:29):
Because Mr. Balaji does make.

Speaker 2 (10:30):
An interesting point at the end of the New York
Times article, where he actually indicates what should be
done in regards to AI and legalities for the future.
And I'm not sure if things like that are already
in the works, and if that could be incentive for,
uh, a billion-dollar corporation to

(10:53):
wanna take somebody out. Again, this is all speculation and
all possibilities.

Speaker 1 (10:58):
Again, I don't know.

Speaker 2 (11:01):
I just feel really bad for the young
man, because I'm pretty sure that's not what he intended
when he was starting out on this, you know, this
journey to tell the truth, or at least to tell
his truth: to be taken out over just a simple opinion.
And his opinion was very simple. Like I said, he
didn't provide any hardcore facts.

Speaker 1 (11:21):
He just said this is how I feel. That was it.
There was nothing solid against OpenAI, so.

Speaker 2 (11:26):
It's just very interesting, I will say that. At
any rate, you guys, I do hope you guys all
stay tuned to what's going to be happening with this story.
We do hope that Mr. Balaji's family gets the answers
that they rightfully deserve, and that the FBI and the
President do investigate this matter. If OpenAI has anything
to do with his death, I do hope that it

(11:48):
is ascertained and discovered quickly. And New York Times,
I'm gonna be honest with you guys: you guys
put this article out there.

Speaker 1 (11:54):
I really hope you are reaching out to the.

Speaker 2 (11:56):
Family and covering his funeral expenses, because that's the least
you guys can do, since you published this story. And
you guys know as journalists how dangerous
it is for individuals to come out against multi-billion-dollar
companies.

Speaker 1 (12:13):
You guys know the dangers involved.

Speaker 2 (12:15):
I don't think Mr. Balaji knew what he was getting
into. I don't think he knew. I don't
think he knew. So I do hope the New York
Times at the very least covers his funeral expenses. All right,
you guys, this is Joseph Bonner with the New York
Times Review, where again we review the New York Times
articles that matter to you the most. Until next time,
you guys, please be safe and take care of each

(12:35):
other, and watch your back. I will say, watch your back,
and trust

Speaker 1 (12:42):
No one because this world is crazy.

Speaker 2 (12:47):
And if you got your truth to tell, tell your truth,
but make sure that you're safe at the same time.

Speaker 1 (12:51):
Okay, all right, you guys, take care