Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also media, Ah, God is dead and you, Jamie Loftus,
have killed him.
Speaker 2 (00:12):
I did it. I finally did it.
Speaker 3 (00:13):
You did it. You did it. Now God was a
him, and Jamie killed him, hammered in the back of.
Speaker 2 (00:18):
The head, and people are going to be critical of that.
But you know, you don't know my story. And in
my six-part miniseries, in which I'm played by
Amanda Seyfried, we're going to start to see my side
of the story. And I'm definitely not going to jail
for what I did.
Speaker 3 (00:34):
That's good. Unlike Theranos founder Elizabeth Holmes, who we just
found out has been sentenced to eleven point two five
years in prison. I'm kind of like, yeah, I have
mixed, like, I have.
Speaker 2 (00:49):
Mixed feelings because it's like people don't go first of all,
the prison industrial complex in general.
Speaker 3 (00:56):
No, it's, it doesn't make anyone better. To the extent
that there's value in the present day in putting people
in prison, it's people who are like a severe ongoing
danger, and I don't see this making anything better. Like
at the same time, I hate her, so I don't
I'm not gonna it's not going to be the top injustice.
(01:16):
I rue today, I.
Speaker 2 (01:19):
Do kind of like that, uh, after being exposed
as an unrepentant criminal, she's like, Oh, I think I'm
just going to kind of be like a normie girl
for a while.
Speaker 3 (01:30):
And I'm I'm going to go normal and have a kid.
Speaker 2 (01:32):
Yeah, And you're like, Liz, it's too late. It's too
late for that, Liz.
Speaker 3 (01:37):
You defrauded people with a fake medical device that led
folks to get treatment for things they didn't
have and ignore illnesses they did have, which is.
Speaker 2 (01:46):
Bad, but before too many bodies hit the floor. But yeah,
that would have stopped.
Speaker 3 (01:50):
You, but you didn't really care. You applied Steve jobs
logic to something that was not just a silly box
to keep in your pocket.
Speaker 2 (02:00):
Lizzie, Lizzie, no, Lizzie, uh, Lizzie.
Speaker 3 (02:06):
Again we learned also putting her in prison for you know,
probably nine years, when you consider all of the other things,
is like not gonna help anything. It'll just mean that
that kid she has grows up without a mom for
nine years, and that's not gonna make that.
Speaker 2 (02:22):
It's like, it's a huge moment for many things can
be true at once. Having to hold all of those
truths and still record an episode of behind the Bastards.
I can tell you about this.
Speaker 3 (02:34):
Actually this actually will be relevant to the episode, but yes,
please please.
Speaker 2 (02:38):
Really? Yes. Okay, this is definitely not going to be relevant,
so let's, let's go with it. There was a legal
case I was thinking about today. It was the Beanie Babies
billionaire, when he was taken to court in twenty
thirteen for holding money in a Swiss bank account.
Speaker 3 (02:56):
He so it was like a tax illegal, oh, a
tax abasion, taxivation? Try.
Speaker 2 (03:00):
Yeah, So he was up for as many as five
years in prison for tax evasion, and he got off
with I mean, he's a billionaire, he's never going to
suffer a consequence, right? But like, he ended up getting
two years of probation on the grounds that it
had been too publicly humiliating, so he didn't have to
(03:21):
go to jail because he was too embarrassed. And
honestly, it was so embarrassing that he didn't have
to go to jail. What? That's absolutely fucking weird.
Speaker 3 (03:31):
Anyways, I'm gonna, I'm going to see how far this
goes by committing murder and then having my pants fall
down, and like, well, Judge, look, yes, I did stab
that man forty-seven times, but then everybody saw my underpants.
Speaker 2 (03:45):
Yeah, pee pee myself.
Speaker 3 (03:47):
So I feel like that where Yeah, can we just
zero this one out?
Speaker 1 (03:55):
Man?
Speaker 2 (03:55):
I love the Beanie babies story so much. I'm surrounded
by my beans feelings.
Speaker 3 (04:00):
Wow, that's good. You do literally have one on your
shoulder right now, just like I have this rifle next
to me.
Speaker 2 (04:09):
I think that we both have our comfort objects at
the ready, right.
Speaker 3 (04:14):
So, Jamie, speaking of Elizabeth Holmes, because the person we're
talking about today is going to be the next story
like that. By this time next year, there probably is
going to be an HBO documentary about this guy. Oh God,
maybe Taylor Kitsch could probably play him, actually, if
he wanted to. Taylor Kitsch played hot David Koresh in
(04:37):
the Waco show. I walked into that one. Yeah, they'd
have to give him like a belly suit or something.
That's not anti-fatness, just being accurate, but he could
do it. Maybe it is.
Speaker 2 (04:48):
I don't want to see this man ever again. No,
I'm mad at Taylor Kitsch.
Speaker 3 (04:52):
You don't want to see Taylor Kitsch again. You don't
want to see those cum gutters again. Unbelievable.
Speaker 1 (04:56):
God, damn it, rock.
Speaker 2 (04:58):
There's plenty of those in this town, Robert, there's a million
cum gutters. I don't need those.
Speaker 3 (05:04):
Oh, that, that is true. But anyway, that is,
that is true of Los Angeles and
no other city. Today we are talking about a guy
who absolutely never comes: Sam Bankman-Fried. Do you
know this guy? Do you know this guy?
Speaker 2 (05:22):
I don't know this guy.
Speaker 3 (05:23):
You don't know this guy. You don't know this guy?
Have you, have you caught any news in the last
week about how like a massive cryptocurrency exchange has collapsed?
This is that guy. Oh, this is the guy with
his like polyamorous sex ring that was running a big
crypto bank in the Bahamas, and it all fell apart.
Now billions of dollars are gone.
Speaker 2 (05:44):
You've lost me again.
Speaker 3 (05:46):
Oh great. Okay, well, I'll try to. This is still breaking.
Because this is a Thanksgiving week episode, and we
only do one episode on Thanksgiving, I needed a single one,
so I just want to give everyone background on this guy.
We may come back to this story because
there's a lot we don't fully understand about how he
did what he did, and to what degree. But
(06:07):
I think this guy. This guy ran a trading service
called Alameda Research and a crypto exchange. An exchange is
basically like a bank, right, It's a cross between a
bank and like a trading platform called FTX, which was
one of the largest cryptocurrency exchanges in the world, and
(06:27):
was also considered by most people to be the most
stable and like ethical and legitimate. Right, people who were
just kind of on the outside looking in, when everything
collapsed earlier this year. Right, you remember that? We haven't.
Speaker 2 (06:38):
Been? Right, I was engaged in that. It's just that
I get a lot of my news from, yeah, journalists on Twitter,
and they've been busy this week.
Speaker 3 (06:46):
Yes. So when the, when crypto like fell apart, when
a lot of crypto fell apart earlier this year, and
like a bunch of places went under, FTX was one
of the ones that stayed stable, and was actually buying
up a bunch of like failing crypto companies to try
to like prop up the industry. They just collapsed, and
like the value of everything has been plummeting for
(07:07):
the last several days. It's a big disaster. It
is very, like, yeah.
Speaker 2 (07:12):
In, in, as in words that will annoy me
as little as possible, can you explain why FTX remained
solvent and others.
Speaker 3 (07:22):
It is not solvent? Oh now why didst because they lied?
So they were they were operating the short end of
how to describe it, and we may there will be
more details to come, but at present it seems fair
to say that it was a giant Ponzi scheme where
they were they were they were taking in money, promising
(07:42):
unreasonable returns, using other investor money to gamble on stuff
to try to provide anyway, and it worked.
Speaker 2 (07:49):
Like all the other market guys, they were they were dishonest.
Speaker 3 (07:53):
Yeah. It is, it is very likely. What, what
differentiates this is the scale, because this is very likely
a financial crime on the level of what Bernie Madoff
did. We are talking in the ten to twenty
billion dollar range. A lot of money was stolen. This is a
serious financial crime. I'm going in on this pretty cold.
So the other, the other kind of massive,
(08:19):
like, touchstone of this is that it has led to
a class action lawsuit against Larry David, Shaquille O'Neal, Tom Brady,
and a number of celebrities who were all in a
Super Bowl ad for FTX.
Speaker 2 (08:31):
I remember that ad. Yeah, it was so embarrassing for my
man Larry.
Speaker 3 (08:36):
Yeah. So the lawsuit's basic basis was: this was a
high-dollar Ponzi scheme and you guys were using your
name recognition to sell unregistered securities, which they were,
which they were, which they definitely were.
Speaker 2 (08:52):
Sorry, I just want to circle back to Shaquille O'Neal
will put his name on anything, to.
Speaker 3 (08:56):
The point where to the very funny.
Speaker 2 (09:00):
I worked at a haunted hay ride this year, which
we don't talk about because it was a bad idea.
But the rival, what did you tell me? Bad idea?
Speaker 3 (09:12):
I guess what.
Speaker 1 (09:13):
I'm alive, bitch, bitch, I lived.
Speaker 2 (09:15):
I lived to tell the tale. It's very unclear who
is right on the side of should I work at
a haunted hay ride or not; I still haven't really
landed on an answer. Point being, our closest rival,
haunted-hay-ride-wise, was Shaqtoberfest. It was, wow, a Shaq-themed
(09:35):
haunted attraction in which the only Shaq-related thing was
a gigantic inflatable Frankenstein that looked like Shaq, which did
sound awesome.
Speaker 3 (09:44):
That sounds actually like the best time anyone's ever had.
Shaq will put his.
Speaker 2 (09:48):
Name on anything, including crypto and Halloween. What
a, what a king.
Speaker 3 (09:54):
Probably not. I'm sure there's horrible things about Shaq that
have come out; that seems almost unavoidable. Anyway, Sam Bankman-Fried
is the guy behind this gigantic financial crime that
is still unraveling as we do this episode. And I
want to talk less about what happened on the exchange,
because none of us want to talk about how somebody
carries out the nuts and bolts of a cryptocurrency scam.
(10:15):
But I want to talk about, I want to talk
about the social elements of this scam. I want to
talk about how he conned the media, how he conned celebrities,
and how he conned regulators. And I just want to
talk about also the way some of these people talked
and wrote about him, because there's a lot about, I
don't know. A week or so ago, we did an
episode on the daily show we do, It Could Happen
(10:36):
Here, about effective altruism, which is, in brief, a theory that,
like, instead of trying to help people just because
they need help, you should only help people after you
figure out the way to help people that is like the
absolute most beneficial way for like the least amount of effort.
It's utilitarianism, right? How can I do the greatest good
(10:58):
with the resources I have? Yeah. And it's merged with
this kind of thinking towards what billionaire types call long-termism.
And the gist of this is, like, it's not worthwhile
for me to do stuff like pay taxes to have
a society or guarantee like universal health care. Instead, I
(11:19):
should make, instead, the most ethical thing that I should
do is make as much money as I personally can
and then put that money into things that I believe
will save the world, like research to stop AIs from
killing everyone and getting to Mars and shit. It's a
way for billionaires and
the other mega-rich to justify like continuing to do exactly
(11:39):
what they want and feel like they're saving the world. Anyway,
Sam Bankman-Fried.
Speaker 2 (11:43):
The guy from Beanie Babies, you know, the Beanie Babies
billionaire, what did he do to improve.
Speaker 3 (11:46):
The world, made a lot of beanie babies, and.
Speaker 2 (11:50):
Then he bought the Four Seasons hotel and kept making
beanie babies. He didn't do shit.
Speaker 3 (11:54):
That's, well, you know, that's, I'm fine with that compared
to these guys, because they're all doing the thing where
they're pretending. Anyway, Sam Bankman-Fried is one of these guys,
and we're going to get into that. But we did
this episode on It Could Happen Here where he was
kind of a tangential character in this very unsettling and
insidious movement that is behind guys like Elon Musk, who
(12:15):
are claiming to be saving the world while just fucking
over people. And then like four days after it came out,
his entire life unraveled and his fortune disappeared overnight because
he was a giant con artist.
Speaker 2 (12:24):
What a treat.
Speaker 3 (12:25):
It's very funny. So that's why we're talking about him
right now.
Speaker 1 (12:29):
Yeah, he's like thirty years old and looks like Mark
Zuckerberg and David Dobrik's love child.
Speaker 2 (12:34):
He's thirty years old.
Speaker 1 (12:36):
I feel great about myself. Wow, a Mark Zuckerberg and
David Dobrik love child.
Speaker 3 (12:40):
Look, I shouldn't call anyone a schlub, but he looks
like a schlub.
Speaker 2 (12:44):
I forget what David Dobrik looks like because my brain
protects myself, respectfully. I understand. Two villains, two villains' love child.
Speaker 3 (12:53):
Yeah, anyway. So, I, uh, yeah. Sam Bankman-Fried was
born in nineteen ninety-two on the campus of
Stanford University, continuing a long and proud tradition of absolutely
nothing good ever coming from that hellhole. His parents
are both extremely prominent Stanford professors. His mother, Barbara, is
a lawyer who clerked for the Second Circuit
(13:16):
Court and graduated from Harvard. She founded Mind the Gap,
a somewhat shady and mysterious Democratic fundraising group. I think
it's shady in that people don't exactly know where all
the money comes from or like what their goals are.
Shady. Yeah, yeah. She also penned
an essay in twenty thirteen that the right wing is
going nuts about, because she was basically arguing that, like,
(13:39):
good and evil are less a factor in what people
do than environmental factors and all that stuff. Like, when
people do things that are bad, it's more often a
product of their environment. It was kind of a, oh God,
what's that fucking psychologist, it was like a Skinner-type
argument, where it's like, well, if people have bad inputs
in their youth, then that's going to determine. And anyway,
(14:01):
I think that's funny, given what happens. I'm gonna guess
she sucks at what she does. She sucks, and so
does his dad, Joseph Bankman. Joseph is also a lawyer.
He is a graduate of Yale. His big claim to
fame was developing a proposal for an overhaul of the
California tax return system that would have filled out citizens'
tax returns in advance. And I know I just said
(14:22):
he sucked, but actually that sounds like a good thing.
I think that that's actually a cool thing to advocate for.
The measure failed by one vote after heavy lobbying from
Intuit, a tax prep software company. Yeah, it
kind of is. It's total bullshit, because
it's the thing everybody agrees with on paper, but nobody will
actually fight the tax prep companies. Which is like, hey,
(14:45):
the IRS like knows more or less what I make
and like knows more or less what I owe. Why
don't I just get a thing from them? Why do
I have to go through this? Like, anyway, there's no need.
But it's like, all other countries do it
that way.
Speaker 2 (14:58):
We don't, though, because there has to be a convoluted
system that's expensive and where they can charge you if
you make the tiniest mistake because you can't read size
one font well.
Speaker 3 (15:08):
And more to the point, because I don't actually think
the IRS is advocating to keep it a pain in
the ass. I think it's these tax prep companies, because
they have an entire industry based on charging people to
do the thing that they have to do to avoid
going to fucking prison. Anyway, I said he's an asshole,
and I'm sure he is, but he was right about this,
and I don't know what to say about that. Like
(15:31):
all of us, Joseph is also a podcaster. He is
the co-host of the Stanford Legal podcast.
Speaker 2 (15:37):
Oh, it's horrible times too. And he's a nerd.
Speaker 3 (15:42):
If it wasn't the holiday season and I wasn't like
getting ready for friends and family and all that good stuff,
I would have listened to his podcast and we would
probably be making fun of him. But you can do
that on your own. Oh goodness, I believe.
Speaker 2 (15:54):
Doesn't it feel so horrible when you
think of how many people do what we do, but
they're the worst person you've ever heard of? It's so sad,
it's embarrassing.
Speaker 3 (16:06):
Yeah, it's it's like, I don't know.
Speaker 2 (16:09):
I avoid self identifying as a podcaster as it is,
it's still not a system. But then on top of that,
they're like, oh, like what like.
Speaker 3 (16:17):
I mean, you know, you know, the only you know,
the only thing that I can compare it to is
like when I started making a living as a writer
fifteen years ago, and I would say that at like
a at like a party or something, someone asked like, well,
what do you do? And I'm like, well, I'm a writer,
and like four other people would say yeah, me too. Uh,
And then you wind up listening to everybody's pitches for
(16:39):
their novels that they're never going to finish. Oh I
mean so eventually I just started lying and saying that
I still worked as special at well.
Speaker 2 (16:47):
I, like, I mean, it's the same thing with, like,
if you say you're a comedian at a party, you're.
Speaker 3 (16:52):
Oh god no, never, never identify as a comedian. Oh me.
Speaker 2 (16:56):
Too, do you? And I've done one whole open mic,
and mine was very offensive, and it is a comedian's job
to push boundaries and be, oh, talk joke.
Speaker 3 (17:07):
You know how Lenny Bruce read that, read that list
of curse words? Well, I just do that with slurs. Here,
let me show you.
Speaker 2 (17:14):
Yeah, you're like, yeah, like, Lenny Bruce was not funny
in that period of his career, even a little. Anyways, anyways,
our jobs are embarrassing, is what I'm saying.
Speaker 3 (17:24):
Anyway, our jobs are indeed embarrassing. So, as you might
guess from all of that, Sam was born into what
amounts to America's, like, liberal aristocracy. He is a fucking
coastal elite, right? This kid grows up on the Stanford
campus to Stanford professors. One of his aunts teaches at
Columbia University, is like a professor there. Seems like he
(17:47):
has close family connections to employees at Yale and at
Harvard as well as Stanford, like, his parents both went
to Harvard, I think it is Kennedy. Yeah, he's that.
You do not get much
more of a rarefied, like, intellectual air.
Speaker 2 (18:04):
He's wearing linens around.
Speaker 3 (18:06):
That is how I think about it. Yes, this, this is
a child who, at age eight, has strong opinions on
Immanuel Kant.
Speaker 2 (18:12):
Which again will cheer for him at the table.
Speaker 3 (18:15):
No, oh, we're about to get into that, Jamie Love.
Speaker 2 (18:19):
I just had a, I just had a vision of
a child sitting at like a holiday dinner and saying, derivative.
And then they're like, Oh, that's amazing. Wow, he is really
coming along, isn't he.
Speaker 3 (18:30):
Yeah, this is a little kid that, when he like
sits down at the doctor's office, pulls out a fucking,
I don't know, Derrida book or something, just, just
so you know, just so you know, he knows fancy philosophers.
Speaker 2 (18:41):
Yeah, god damn it. Board book.
Speaker 3 (18:45):
So his parents raised him and his brother
to be utilitarians. One of the articles about them. I
was raised into SpongeBob. Okay, yeah, yeah,
I was. I was raised to hassle cows in our
back forty. So this article notes that
nights around the family dinner table often focused around debates
(19:07):
about how to do the greatest good for the greatest
number of people. In later interviews with, I'm going to
get to this guy in a second, the absolute dick-ridingest
journalist to ever write, Sam would claim
that his most formative moment came at age twelve, when
he was weighing arguments around the abortion debate.
Speaker 2 (19:25):
So first off, no. Because, because not only no,
also, like, in the context
of, like, where would he have been doing this?
Speaker 3 (19:41):
I'm guessing, like, around the family table, or when they
have the, you know, everybody's got their brandy
and he's drinking some sort of fucking tea that's insufferable,
and they're all talking about people's rights as if it's
like a fun intellectual problem, like, how to fix it.
Speaker 2 (19:56):
Because for them it is because wherever they land, the
rules aren't going to apply to them anyways.
Speaker 3 (20:01):
Yes, yes, where does he Where does he fall?
Speaker 2 (20:03):
Where does he fall on the debate?
Speaker 3 (20:04):
That's a great question. So I'm going to quote now
from an article previously published by Sequoia Capital and written
by Adam Fisher. They have since taken this article down.
When I say the dick-ridingest,
like, fucking PR-flack journalist, it's, it's shameful. Quote:
A rights based theorist might argue that there aren't really
(20:25):
any discontinuous differences, as a fetus becomes a child, and
thus fetus murder is essentially child murder. The utilitarian argument
compares the consequences of each. The loss of an actual
child's life, a life in which a great deal of
parental and societal resources have been invested, is much more
consequential than the loss of a potential life in utero,
and thus to a utilitarian abortion looks more like birth
(20:46):
control than like murder. SBF, that's what they always call him,
the kid Sam. SBF's application of utilitarianism helped him resolve
some nagging doubts he had about the ethics of abortion.
It made him feel comfortable being pro-choice, as his friends, family,
and peers were. He saw the essential rightness of his
philosophical faith. So that's very fucked up. That is, that
(21:07):
is so deep, Like the term choice is used at
the very end there, but it's clear that like he's
not thinking about this in terms of like the actual
value of human bodily autonomy. That does not weigh in
the utilitarian calculus for him whatsoever.
Speaker 2 (21:21):
No. That is, to quote my friend Robert: no, no, no.
Speaker 3 (21:28):
And again, even like, look, I shit reflexively sometimes on utilitarianism,
not because of the inherent value or disvalue of thinking
that way, but about the way it gets talked about
by these people. But like, if you're actually a utilitarian
and you care about the greatest good for the greatest
number of people, then bodily autonomy should factor into that, right,
(21:48):
like, human bodily autonomy should be hugely important to you. Yes,
but no, that's not logical. All that matters is, like,
well, if fewer resources than
this have been invested in the fetus, then it's not
a person, so abortion is okay. That's fucking bullshit logic.
Fuck you, you're doing too much math. Stop, this isn't
(22:09):
a math problem, Sam. Like, this is not a fucking
math problem. Not everything's a goddamn math problem.
Speaker 2 (22:14):
Robert, he won't listen to you unless you call him SBF.
And I was like, why is he talking about sunscreen?
Speaker 3 (22:21):
We'll get to that. They all call him fucking SBF and
I hate it. But also, as this went on, I
started using it more and more because it's a pain
in the ass to type his whole fucking last name out. Look,
this is one where I'm not going to give
him a pass, but I get it. If you write
about this fucker a lot, it does make it easier.
So anyway, all of this is very bad. But you
(22:43):
know what's not bad, Jamie? Well, the products and services
that support this podcast.
Speaker 2 (22:48):
That's not true.
Speaker 3 (22:49):
They are that's well.
Speaker 2 (22:52):
I checked. Hold on, I just ran a quick check
on that, and you can.
Speaker 3 (22:57):
But but what about the greatest good for the greatest
number of people? And and given that I'm I'm a people,
so it works out pretty well. It works out very
well for me.
Speaker 2 (23:07):
I think that if you actually have more advertising revenue,
you will actually build a really fast train like you've
been promising me you will.
Speaker 3 (23:15):
Yeah. Yeah, I'm gonna, I'm gonna build the Hyperloop.
Speaker 2 (23:18):
Yeah yeah.
Speaker 3 (23:19):
And I'm gonna promise you one thing. It's gonna kill
a hell of a lot more people than that Simpsons
monorail did.
Speaker 2 (23:27):
And, and look, not
everyone is going to hold you to task for that,
but I am.
Speaker 3 (23:32):
Thank you, Jamie, thank you for keeping me honest and
ensuring that we we we really make a memorable disaster.
Speaker 2 (23:41):
Look, I'm available anytime I'm not visiting my friend Liz
in jail.
Speaker 3 (23:45):
Mm-hm. We are back. What a good time. So
Sam Bankman-Fried is, above all else, a numbers guy,
and I guess as a kid, he was a numbers kid.
His parents sent him to Crystal Springs Uplands, a fancy
prep school in Hillsborough, California. I looked through the website
(24:09):
because I wanted to make fun of it, but it
just kind of seems like a really fancy school. I
don't know. I'm sure it's a great place to get
an education. They, I will tell you this, they
devote a lot of screen resources to
letting you know that they are not racist and that
most of their students aren't white.
French They also have a French cinema class for sixth graders,
which is fine, But the cranky asshole in me that
(24:31):
still has a little piece of my soul raised by
right wing radio wants to say shit about it.
Speaker 2 (24:36):
I was raised by, by left-wing people, and I
still think that that's some loser shit, man. I think
that that's fucking dorky and goofy. And like, you know,
you meet, because you meet people
like that in the wild, and they're sometimes very, and
maybe even often, very sweet people. But I'm like trying
(24:59):
to be like, oh, do you know who Plankton is?
And they're like, no. But they've been watching French
movies since they were like seven, and.
movies since they were like seven, and.
Speaker 3 (25:07):
I just don't expect that. Yeah, if I meet a sixth
grader with strong opinions about French cinema, like, I'm just
gonna leave. I'm gonna leave. I'm just gonna walk away.
Speaker 1 (25:19):
Wow.
Speaker 2 (25:19):
Robert, that's really brave of you, to march out
of a conversation with an eleven-year-old like I am not.
am not.
Speaker 3 (25:24):
I'm not putting up with that shit. Absolutely not.
Speaker 2 (25:27):
I'm leaving this sixth grade class.
Speaker 4 (25:29):
My name's Robert Evans, and I think, get the fuck
out of here. Go fucking watch your, what, Renoir? Is that
one of them? That sounds like one of them. Claude Renoir?
Is he a painter?
Speaker 2 (25:39):
Renoir is a painter. I know the one you're
looking for, and I was looking for it too, but
I don't remember. But guess, do you know who
Plankton is?
Speaker 3 (25:47):
Of course I know who Plankton is, and I
also know that at least one of the directors they
study is a pedophile. Just knowing a little bit about
French cinema, that's, that's unavoidable. So, uh, he does well. Again,
the school is probably fine. He does well at the school.
He was notably insular. He avoided most of his classmates
(26:08):
to play StarCraft, which is good, and League of Legends,
which objectively sucks. He also played a lot of Magic:
The Gathering, so I am confident he did not get
laid in high school. This is based on extensive personal experience.
Speaker 2 (26:25):
Like I just actually did some field research and yeah,
about four years of it.
Speaker 3 (26:35):
Okay.
Speaker 2 (26:35):
Interesting.
Speaker 3 (26:37):
For college, he was accepted to and attended MIT, which
marked out his family's elite North American university punch card.
They really hit them all, now that they've got
an MIT kid in the family.
Speaker 2 (26:48):
They get a free coffee. There used to be an
MIT. So when I was in comedy in Boston, there was, like,
MIT had like a secret comedy club that was just
for MIT students and it was awful.
Speaker 3 (27:02):
Oh god, yeah.
Speaker 2 (27:04):
They paid you, okay, but it was like, they're like,
if you like knew someone who like met someone who
went to MIT and they came to your
shows and they're like, oh, we, we did the math
and you are allowed to come to ours. They
called it their speakeasy. I wonder if it's still around.
It was so, it was, I mean, it's not fun. Yeah,
(27:25):
not a fun crowd, I will say. But best of
luck to whatever was going on there. Yeah, so
he goes to MIT. He goes, he might have
been at one of those shows where he might.
Speaker 3 (27:37):
Have, because he joins a fraternity there at
MIT. Well, Jamie, it's an MIT-specific
co-ed nerd fraternity.
Speaker 2 (27:46):
So nice.
Speaker 3 (27:47):
It's, what could go wrong, Theta. And here's how Adam Fisher,
the guy I hate, described them in his article, which
was bad, quote: a co-ed fraternity of super-geeks similarly
interested in Magic and video games. Thetans are fond
of debating math, physics, computer science, linguistics, philosophy, and logic
problems for fun at alcohol-free parties. Now, I do
(28:10):
know a little bit about MIT, and I know another
thing these nerds often do is kill themselves using nitrous oxide,
because they will try to flood entire rooms with nitrous
to do like a twenty percent nitrous-to-oxygen ratio
and kill themselves. It's a thing that happens. Look it up:
MIT nitrous death.
MIT nitrous dead.
Speaker 2 (28:26):
Yeah, all I did in college was drink too many
blue moons.
Speaker 3 (28:33):
It's it's it's quite a thing.
Speaker 2 (28:36):
Yikes. So I don't know doing over there.
Speaker 3 (28:39):
I don't know. I don't know. I don't know. I
don't know how much I believe that they were
always alcohol- and drug-free parties, because if there's one
thing I know about nerds, it's that they do a
shitload of drugs.
Speaker 2 (28:49):
I definitely didn't hear of this one because I went
to a couple MIT frat parties and they were not sober.
Fun fact: one of the MIT frat parties I
went to, for some reason, when I was in college,
when I would get really drunk, I would always,
I would like, I would like to, like,
steal things from wherever I was. And so I stole
(29:09):
two critical pool balls from an MIT frat house, and
someone was able to trace it back to me, and
they demanded their pool balls back, and I, embarrassingly, I think
I capitulated. I think I did give them back. I shouldn't have.
Speaker 3 (29:24):
Wow. Wow. Actually, so this one kid I'm finding, in
ninety-nine, died because he put a bag over his
head to inhale nitrous, which is, fuck, man. How did
you get into.
Speaker 2 (29:33):
MIT? I don't know. And I was in front of this
guy? I don't know.
Speaker 3 (29:39):
But don't put bags. Don't put bags over your head kids.
Speaker 2 (29:43):
Yeah, fucking I learned that in public school.
Speaker 3 (29:46):
There are so many other ways to do whippets than
putting a plastic bag over your head. Anyway, whatever. Next,
according to the popular, Sam Bankman-Fried-endorsed version of
the story, he pivoted towards an almost obsessive devotion to ethics
in his freshman year. He went vegan, he organized a
protest against factory farming, and he worried obsessively over how
(30:07):
he could change the world for the better. And it
was at this point that Sam met a man who
was going to change his life forever: William MacAskill. If
you want to learn more about this guy, I do
recommend the episode of It Could Happen Here on effective altruism.
This guy is today the top pop philosopher of
effective altruism and longtermism. He is in Elon Musk's
text messages that we all got as a result of
(30:29):
the Twitter lawsuit. At this point, he was also at
MIT and he met with Sam at a cafe in Cambridge,
Massachusetts where.
Speaker 2 (30:39):
MacAskill... which cafe? Which one?
Speaker 3 (30:40):
Which I don't know. I'm sure it's out there somewhere, Jamie,
it's I'm sure you've gotten a hot dog there. That's
that seems likely.
Speaker 2 (30:49):
Now, any place this guy's going it doesn't have hot dogs.
They're not ethical. They're famously unethical.
Speaker 3 (30:54):
So yeah, probably right. So MacAskill explained the concept of
effective altruism to him, which is again this idea that,
like, what matters is you should think kind of coldly
and robotically about how you help, to make sure
that your charity money does the most that it can do.
But one of the big like arguments about it is
that like, okay, well, what if you, you know, should
(31:16):
you save a drowning child instead of saving like three
kids from a burning building. And it's like, that's a
nonsense choice. Nobody's ever been presented with that choice at
any point in the history of the human race. That
is not reasonable. There's no
point to that ethical argument. You're not smart for debating
the problem.
Speaker 2 (31:33):
Just draft driver. Yeah, and it makes no sense.
Speaker 3 (31:37):
Yeah, it makes no fucking sense. There's like bits of
it that are reasonable, which is that like, well, you know,
it makes sense to like look at the best thing
you can do financially, you know, in terms of donating money,
is you know, malaria prevention, because it winds up being
the most cost effective thing. But it's like, okay, does
that mean we shouldn't put money into making the water
in Flint, Michigan drinkable? And a lot of these guys
will say no because that's not the best use of money.
(31:59):
And it's like, well, we can do many things with money,
especially if we tax billionaires and put it towards rebuilding infrastructure,
Like, a number of things can be done. Sorry, anyway,
my friend Elon's shitty MacAskill kind of pills this guy
on effective altruism. He frames it as a strategic investment
whose success was measured in populations worth of human lives.
(32:22):
He estimated, using back of the envelope math, that two
thousand dollars could save one life, and so a million
dollars could save five hundred people, a billion could save
half a million, and a trillion dollars could theoretically save
half a billion lives. Based on that totally legitimate math,
the only math, if people are math, it all works
out that way. Yeah, based on that absolutely real math,
(32:46):
the only ethical way for a genius like Sam to
use his time and talents is to become the world's
first trillionaire. And I'm going to quote again from that
article. I know. Quote: SBF listened, nodding, as MacAskill
made his pitch. The earn-to-give logic was airtight.
It was, SBF realized, applied utilitarianism. Knowing what he had
(33:06):
to do, SBF simply said, yep, that makes sense. But
right there, between a bright yellow sunshade and the crumb-strewn
red brick floor, SBF's purpose in life was set.
He was going to get filthy rich for charity's sake.
All the rest was merely execution risk. His course established.
MacAskill gave SBF one last navigational nudge to set him
(33:27):
on his way, suggesting that SBF get an internship at
Jane Street that summer, and.
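The back-of-the-envelope figures quoted above are easy to check; here's a minimal sketch of the arithmetic, where the two-thousand-dollars-per-life figure is the article's own assumption, not a vetted number:

```python
# MacAskill's "earn to give" back-of-the-envelope arithmetic, as quoted
# from the Sequoia article. The $2,000-per-life figure is the article's
# assumption, not a vetted number.
COST_PER_LIFE = 2_000  # dollars per life saved, per the article

def lives_saved(donation: float) -> float:
    """Lives the pitch claims a donation of this size would save."""
    return donation / COST_PER_LIFE

print(lives_saved(1_000_000))          # 500.0
print(lives_saved(1_000_000_000))      # 500000.0 (half a million)
print(lives_saved(1_000_000_000_000))  # 500000000.0 (half a billion)
```

So the numbers in the pitch are at least internally consistent; everything else about the argument is the contested part.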
Speaker 2 (33:32):
So for the good of man, for the good.
Speaker 3 (33:35):
Of mankind, get in the finance industry and gamble.
Speaker 2 (33:38):
Like a motherfucking asshole. I know. Oh, God.
Speaker 3 (33:42):
I fucking hate these people so much.
Speaker 2 (33:45):
What God makes sense?
Speaker 3 (33:47):
Yeah?
Speaker 2 (33:48):
You know, look, I think that.
Speaker 3 (33:51):
I can't debate that logic. There's no argument to be
made about that logic, Jamie.
Speaker 2 (33:55):
I mean, I know that this is the wrong person
to be turning on in this moment, but this is
the dick riding. Like, this is the...
Speaker 3 (34:03):
Has not begun to ride dick. This is, this is choking.
Speaker 2 (34:06):
He is choking. This man's got no gag reflex. And
no sign of slowing down, Jamie.
Speaker 3 (34:13):
I'm gonna read you some passages from this that are
going to make you gag. It is unbearable. So and
it's like, this article is like ten thousand fucking words.
It took me like an hour to get through this thing.
Speaker 2 (34:24):
It's massive. He's like, maybe if I do it,
he'll give me a kiss on the mouth.
Speaker 3 (34:27):
Well, well, basically this this was published by Sequoia, which
is a massive, like, investment fucking fund thing, on.
Speaker 2 (34:34):
Their website. A journalistic entity.
Speaker 3 (34:37):
Yeah, it looks like that. It looks exactly like an
article from like Wired or something like. They clearly laid
it out like that. But I think it was done
because they put like two hundred million dollars into his company,
so they needed to justify it by making him look
like a genius. So it's like.
Speaker 2 (34:50):
Those articles that like are occasionally, I mean, it's more
scary when they're on actual journalistic outlets and then there's
just a little tag saying like hey, this is sponsored
by RuPaul's fracking farm or whatever the fuck, and it's
like, why? Why frack at all?
Speaker 3 (35:05):
Ah, Exxon Mobil, LGBT icons. So yeah, Sam Bankman-Fried
gets into finance and he's a very good trader. As
I mean, I have no way to judge this, but
Sequoia says he was a good trader. He was good
at making a lot of money for other people and
also a lot of money for himself. Besides, he gave
(35:26):
away fifty percent of his income to his favorite charities,
but those charities were mostly the Center for Effective Altruism
and eighty thousand Hours, which is also an effective altruism charity.
Speaker 2 (35:36):
What do they do with the money?
Speaker 3 (35:38):
That's a great question, Jamie. It allows guys like
MacAskill to live very well, while also saying they only
take thirty thousand dollars in salaries and give away the
rest because their lives are heavily subsidized by these organizations
that allow billionaires to pretend to be heroes of, like, charity.
So, like charity. Yeah. So he remains there happily for
years until twenty seventeen when he begins to feel as
(36:00):
if something is not right. Now, spoilers: he is having
a quarter-life crisis. And
this kid absolutely is a con artist. And what I
am giving you is the polished, press friendly version of
the story for a guy whose entire life, as far
as I can tell, was one long setup for an
ambitious con. So when I say stuff like he gave
a lot of money to charity, because there's like other
(36:21):
charities he gives to, some of which sound reasonable, but I
have actually no evidence that he did. Like, I have
no evidence that he did, and I haven't seen it from
one of them. So when I say stuff like he
felt like that, or when they say stuff like he
felt unfulfilled at Jane Street, that doesn't mean he actually did.
Because we are at present reliant on a lot of
reporting from back when this kid was the toast of
(36:41):
Wall Street. Now, after his life fell apart and his
company crashed and it became clear that he was a
financial criminal, it also came out that the guy who
wrote The Big Short had been following him for six months.
So I suspect at some point, that's gonna
be fun. We're all going to be in for a
real treat when.
Speaker 2 (36:57):
That book hits. I love. I love when you're like,
and guess who is following him around? You're like, Oh,
he's got Michael Lewis on his tail.
Speaker 3 (37:05):
It's like, and also, if you're an investor, shouldn't, like,
somebody involved in one of these companies probably
have been able to find out, like, oh, hey, the
Big Short guy's hanging out with him. That probably means
this is a giant financial crime. That guy's not going
to just hang out with a dude who's good at
legally making money to write about how good he is at
making money legally. That's not Michael Lewis's beat.
Speaker 2 (37:28):
I'm kind of going for like a change of pace this time.
I'm just gonna try to see a guy who's like
doing something right.
Speaker 3 (37:34):
Yeah, Michael Lewis, aren't you the guy who only writes
about financial crimes on like a gigantic scale?
Speaker 2 (37:40):
No, no, no, I know this is he's just probably
just trying to network.
Speaker 3 (37:45):
Yeah, he just really liked this guy's attitude towards altruism.
So anyway, yeah, anyway, here's how that, again, very dick-riding
PR flack motherfucker wrote about what happens next. Quote:
He was, he realized, too secure. SBF's mind had
been trained almost from birth to calculate. As a schoolboy,
(38:06):
the hedonic calculus of utilitarianism had him trying to
maximize the utility function, measured in utils, of course.
During his teenage gaming... I know, I know that's
a sentence.
Speaker 2 (38:20):
Does he say, of course?
Speaker 3 (38:22):
Uh no, no I said that. Oh no, he does say,
he does say, he does say, of course. Yes, measured
in utils, of course.
Speaker 2 (38:29):
Of course, of course. Oh, suck my ass.
Speaker 3 (38:33):
Unbelievable. Quote: During his teenage gaming years, his mathematical
abilities allowed him to sharpen his tactics and win, and
of course, every trade SBF ever made at Jane was
the subject of a risk reward calculation, all of it
boiled down to expected value. The formula is fairly simple.
If the amount won multiplied by the probability of winning
(38:54):
a bet is greater than the amount lost multiplied by
the probability of losing a bet, then you go for it,
irrespective of units: utils, euros, dollars. We're all subject
to the same reckoning. But at Jane, SBF absorbed
another trading principle he learned: to be risk neutral. In
simple terms, a trader given a choice between a sure
fifty dollars and a fifty percent chance at one hundred
dollars must be agnostic if they want to maximize the
(39:16):
expected value of earnings over a lifetime. Those who prefer
that sure win are risk averse, and those who would
rather gamble are risk lovers. But both risk lovers and
the risk averse are suckers equally because over the long run,
they lose out to the risk neutral who take both
deals without prejudice. That makes no sense. That makes no
sense at all, because you're assuming you have to like
choose between one. Can you just take both? Is that
(39:38):
like the is that the offer? Because it seems like
the whole thought experiment is about choosing between one. None
of this makes very much sense.
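The expected-value rule and the fifty-dollar coin-flip example the article leans on can be written down directly; a minimal sketch (not SBF's actual code, just the formula as the article states it):

```python
# The expected-value rule as the article states it: take a bet when
# (amount won x P(win)) exceeds (amount lost x P(lose)). Units don't
# matter: utils, euros, dollars.
def expected_value(p_win: float, amount_won: float, amount_lost: float) -> float:
    """EV of a bet; P(lose) is 1 - P(win)."""
    return p_win * amount_won - (1 - p_win) * amount_lost

# The risk-neutrality example: a sure $50 versus a 50% shot at $100.
sure_fifty = 50.0
coin_flip = expected_value(0.5, 100.0, 0.0)
print(coin_flip)  # 50.0 -- identical EV, hence "must be agnostic"
```

The "risk neutral" claim is just that both options have the same expected value, which is why Robert's objection stands: nothing in the formula explains why you would ever be forced to pick only one.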
Speaker 2 (39:44):
Like, Jesus, I had a brain hemorrhage in the middle
of that, and then I was thinking, I couldn't stop
thinking about do you think that utilitarians? How do utilitarians
feel about kissing with tongue. Do you think how many
utils does it take to kiss with tongue? Or don't
waste your fire.
Speaker 3 (40:02):
Let me write the equation out, Jamie, and try
to sketch out the math.
Speaker 2 (40:07):
No, I think they wouldn't be into it. I think
they would be like, what's the point?
Speaker 3 (40:11):
Yeah, that's, that seems like, really, five utils? Okay, Jamie,
I got to continue this. I don't know, but
we have to read another quote. Here, SBF realized,
was the rub. When he applied this principle to his
own life, he came up short. There was little chance
he'd get himself fired from Jane Street. Thus, the decision
(40:32):
to stick with Jane was a risk averse preference. It
was the logical equivalent of being offered a choice between
fifty dollars and fifty percent of one hundred dollars and
saying give me President Grant. SBF was risk neutral on
behalf of Jane Street, but not, he realized, for his
own life. To be fully rational about maximizing his income
on behalf of the poor, he should apply his trading
principles across the board. He had to find a risk
(40:54):
neutral career path, which, if we strip away the trader
jargon, actually means he needed to take on a lot
more... oh, Jamie, you need to hear this, boy.
Speaker 2 (41:04):
We're stripping it away. I've been asleep for six minutes.
Speaker 3 (41:07):
Come on, Which, if we strip away the trader jargon
actually means he felt he needed to take on a
lot more risk in the hopes of becoming part of
the global elite. The math couldn't be clearer. Very high
risk multiplied by dynastic wealth trumps low risk multiplied by
mere rich guy wealth. To do the most good for
the world, SBF needed to find a path on which
(41:27):
he'd be a coin toss away from going totally busted.
So what path is risk neutral? But that means taking
a lot of risk, because the most risk is the
only way to become the wealthiest person in the world.
And only by becoming the wealthiest person in the world
can you avoid risk? You get it, Jamie.
Speaker 2 (41:44):
Yeah, And that's the most ethical thing you can do.
Speaker 3 (41:46):
Right, That's that clearly the most logical ethical way to live.
Speaker 2 (41:49):
So do you think that they kiss with dun or not?
Speaker 3 (41:53):
I mean, I think what he's saying is, in
order to avoid the risk of catching an STD, you
have to take on a job as the bath mat
at a brothel. Ah, that's the risk-averse, yeah,
or risk-neutral presentation.
Speaker 2 (42:11):
Frantically rubbing my final brain cells together, trying to make
heads or tails of that and, like, spark it.
Speaker 3 (42:18):
It is howling clown shit. It is absolutely sparking nonsense. It's like.
Speaker 2 (42:24):
Of vortex of bullshit to be like, so anyways, it's
really clear and the math couldn't be clearer that he
has to be the most richest guy or everyone is
going to die, like actually really urgent.
Speaker 3 (42:37):
You know, it's one of those things because I have
I have known a number of rich guys in my life,
and some of them are, look, there's two kinds
of people. There's people who were poor at
one point, and some of those people are unhinged, and
some of them still remember being poor enough to talk
like normal people. And then there's people like this who
(43:00):
Sam was never rich as a kid, but he lived
in this rarefied air where finance and, like, the concept
of worrying about money or his economic status was not
a thing because everyone around him when he was a
kid was so high status, right, and he like he's
just lived in this it's not even a it's not
even a bubble. He grew up on a different planet,
(43:21):
Like the world does not exist to him the same
way it does to everyone else. And so he's like,
that's the only way you can talk about things, in
this way, or it's just this sort of sinister thing
where you're talking about everything like it is very, you know.
Speaker 2 (43:37):
It's like, it's so easy for him and people like
this to think of other people as theoreticals because they
had never had a problem before.
Speaker 3 (43:43):
So it's like I've never.
Speaker 2 (43:44):
Had a game of chance because I never never met
a person, right, right, he has never known a human being.
Speaker 3 (43:52):
Just Stanford professors exactly, just Stanford professors and problems in
his video games.
Speaker 2 (43:57):
So yeah, and not that all nerd. I mean, I'm
not I'm a I'm afraid of nerd. You know, people
who identify as nerds coming into my mansions. I'm not
saying that that's everybody, but I'm saying, like, he did
not get socialized like a person, and that has to do
with class.
Speaker 3 (44:11):
Yeah, God damn it. So the next thing that SBF
did after deciding he had to quit Jane Street is
start pondering how he might change the world in a
way that minimized his risk. By maximizing his risk or
some shit. Anyway, as he told it, he considered four
career fields, and these are his notes
on the four things he might do after being a trader.
(44:34):
Number one journalism, low pay but a massively outsized impact potential.
Number two running for office or maybe just being an advisor.
Number three, working for the movement; EA, effective altruism,
needs people. Number four starting a startup. But what exactly?
Number five bumming around the Bay Area for a month
or so just to see what happens. And again sounds like.
Speaker 2 (44:57):
A bad Bumble date. Like, it's just like, so what
do you He's like, om, well, so startup maybe, but
what is a startup? Really?
Speaker 3 (45:05):
It is a startup? Yeah? So again, he spent years
working in finance, He's got plenty of money. He came
from the Bay Area. So five was an option and
it's one he took. And by the way, like as
a general rule, if you decide to quit your job
and you have the financial ability to putter
around for a month or two and think things through,
not a bad idea. But Sam is going to do
(45:25):
this in the worst way possible. He eventually hits upon
his great next idea, which is to make a shitload
of money in crypto. Now, when he quits Jane Street
it's twenty seventeen, and if you guys can remember back
that far, that's the first big winter when cryptocurrency boomed
like kind of all throughout the last quarter or so
of twenty seventeen, Bitcoin was just sailing up like massive rises.
(45:48):
Ether had a big rise too; this just kind of went
on, so by early twenty eighteen, a lot of
bitcoin nerds, people who'd been made fun of for
years, became overnight multimillionaires. And this was kind of
the first point at which normal people started to think, shit,
maybe I should get into this, maybe I can make
a lot of money. Right, This is the thing that
blew up bitcoin, and there was a crash after this,
(46:09):
but it recovered, YadA, YadA, YadA. Sam was savvy enough
to look at this and know that these moments, where
a lot of funny money is on
the table but there's no regulation, and regular people have
started to get interested because they think they might get rich.
This is the point at which an unethical person can
make the absolute most money in a financial market.
Speaker 2 (46:29):
Right, and you can be unethical, to quote one of
the greats, you can be unethical and still be legal.
That's the way I live my life.
Speaker 3 (46:35):
And you know who else lives their life that way?
Speaker 2 (46:37):
Jamie Loftus, you, and you're tossing to the ads.
Speaker 3 (46:40):
Yeah, that's right. I am, indeed, I am indeed. Maybe
here we go, all right, and we're back. So Jayloft gelosis.
Speaker 2 (46:58):
Yes, it's actually Jay. Oh it's the first. I'm the
first person to use that, and it's really starting to
catch on.
Speaker 3 (47:05):
Bold. Oh, Ben Affleck's calling me, one sec, Jamie, let
me take this call.
Speaker 2 (47:10):
Oh that's my boyfriend.
Speaker 3 (47:12):
No, he's just weeping outside of a dunkin Donuts and
pocket dialed me again, normal normal Ben stuff? Am I right?
Speaker 2 (47:18):
Old Ben? Look sometimes I meet I meet up with
my friend Ben at the Atwater Dunks and I really,
I really set him straight. It's nice.
Speaker 3 (47:29):
Yeah, Ben Affleck, sober for years and looks like he's
hung over in every single photograph. Oh what a king man.
I know, I do, I do, I have a lot
of There's just a something about him that warms my heart.
It's his back tattoo.
Speaker 2 (47:43):
You love his back tattoos.
Speaker 3 (47:45):
It's just that takes courage, you know, moral courage.
Speaker 2 (47:48):
I just think he's so amazing fucked up that I
think we need to have that.
Speaker 1 (47:52):
Back tattoo and still got to marry Jennifer Lopez.
Speaker 2 (47:58):
I think, I mean, they should put.
Speaker 3 (48:00):
His name on the Vietnam Memorial in honor of the
courage that it took to get that back tattoo.
Speaker 2 (48:06):
He really is braver than the troops. And he did
for a while hold the status of most divorced man
in America, but that is now but he fixed it.
Speaker 3 (48:16):
But I feel like the contest I was.
Speaker 1 (48:20):
Gonna say they did they did get married at a plantation.
Speaker 3 (48:24):
I don't know. Yeah that I mean, that's all. That's
all very fucked up. But I do feel the contest
for most divorced man. Did you ever watch Dragon Ball
Z as a kid, Jamie.
Speaker 2 (48:33):
Yes, I didn't. No, I didn't.
Speaker 3 (48:35):
Well, there's this thing that the Dragon Ball Z you
would always do this thing where like you have this
guy and he's like the most badass person ever that
everybody has to figure out how to fight, and it's
this big problem because this is the most terrifying thing
in the universe. And the next season like there's something
that's like a thousand times scarier. It's just this like
power creep kind of thing. I feel like we've all
been dealing with that with a divorced guy, because divorced
(48:55):
guy Ben Affleck doesn't even register on the divorced guy.
Speaker 2 (48:59):
Scale, in the age of Kanye and Elon. It's really
divorced out there, and look out, because Tom Brady's about to fucking
hit the world stage.
Speaker 3 (49:08):
Tom Brady's gonna go Super Saiyan divorced.
Speaker 2 (49:11):
It's amazing that a third divorced man
has hit the world.
Speaker 3 (49:16):
It is bad. It's bad. Incredible. So back to Sam
Bankman-Fried, right. He has just decided to
get into crypto. Now, the first way to make money
in crypto that occurred to him, because he spends a
bunch of time looking into the market, is
he sees that there's this thing called, I think,
the kimchi premium or whatever. They
(49:40):
come up with some weird, kind of racist name about it,
which is basically bitcoin is worth a lot more in
Japan and Korea than it is in the United States. Right,
It's like worth fifteen grand in Japan and Korea and
it's like ten grand in the United States something like that.
Why is that? There's a variety of complicated factors. Basically
there's a bunch of different laws around banking and who
can and cannot hold accounts and execute trades in those areas.
(50:02):
That leads to this premium, because, like, normally, if there's a
premium like that, if the markets were kind of accessible
to each other, and bitcoin's worth fifteen grand in Asia
and ten grand in the US, then you buy bitcoin
in the US and you sell it in Asia and
you get free money, right? Very obvious, if you can
do it. But you can't do that because you can't get
access to that as an American. You can't like get
(50:24):
a Chinese account in order to buy bitcoin there right,
or a Korean account or a Japanese account. There's all
these laws, so.
Speaker 2 (50:31):
They can't do like a drinking VPN.
Speaker 3 (50:33):
No, you can't. You cannot do that, and no one
can figure out how to do it, how, as a
Westerner to sell bitcoin over in these parts of Asia
and get that premium right and like just get a
bunch of free cash. Sam figures out a way to
do it, which is basically like picking up a pile
of free money. Right, if you're buying bitcoin for ten
(50:54):
grand and other people want to pay fifteen grand for it,
you're just making cash, right, And primarily the way he
does that is through friends in the effective altruism community
who are like placed in banks and stuff over in Asia,
who like help him figure out how to do this.
I'm not going to go into details. They I mean,
this fucking dick-riding article spends a long time explaining
(51:14):
how he figures this out, and it might even have
been legal. He may not have broken the law to
do this, although it's kind of hard to know because
all of this is complicated finance gibberish.
Speaker 2 (51:23):
By the way, I'm honestly, I'm not getting a word
of this.
Speaker 3 (51:27):
Finance guys call this an arbitrage, and it's basically
the idea that if there's a resource
that's worth a bunch more money one place than it
is in the other place, you buy it where it's cheap,
and you sell it where it's expensive, and you get
free money. Right, that makes sense? Yeah? Yeah? If,
Jamie, if my drug dealer sells me
(51:48):
ketamine for like forty bucks, a gram, right, and I
don't know. At a house party you're hanging out at
a couple of blocks away, somebody says that they'd
pay seventy dollars a gram for ketamine. You can make
thirty free bucks by taking the ketamine you bought for
forty bucks and selling it at that other house party, right.
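Robert's ketamine analogy is the whole trade; here's a toy sketch of the arbitrage arithmetic (prices are the episode's illustrative figures, and real cross-border crypto arbitrage also eats fees, transfer delays, and currency controls that this ignores):

```python
# Toy arbitrage arithmetic: buy where the asset is cheap, sell where
# it's expensive, pocket the spread. Prices are the episode's
# illustrative figures; real trades also pay fees and transfer costs.
def arbitrage_profit(buy_price: float, sell_price: float, units: float = 1.0) -> float:
    return (sell_price - buy_price) * units

# The "kimchi premium": BTC around $10k in the US, $15k in Korea/Japan.
print(arbitrage_profit(10_000, 15_000))  # 5000.0 per coin
# Robert's house-party version: ketamine bought at $40/g, sold at $70/g.
print(arbitrage_profit(40, 70))          # 30.0
```

The hard part was never the arithmetic; it was getting legal access to both sides of the market, which is what the EA contacts in Asian banks solved for him.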
Speaker 2 (52:09):
Okay, so that was you speaking to me in terms
that you would understand. But I do think I got it.
Speaker 3 (52:14):
Yeah, yeah, ketamine makes everything makes sense. Yeah, so you
just gave it to me in Robert speak, I got it.
I got it. He makes a shitload of money doing this,
and he decides to roll this money into a company
which will allow him to hire employees to gamble with
cryptocurrency at scale to try and find different fucked up
little areas like this where they can make a bunch
of money by executing trades. He picks members of the
(52:37):
EA community as his first employees, including Caroline Ellison, a
former coworker at Jane Street who we will be
talking about in a little bit, and Nishad Singh, a
former Facebook employee. Industrial-scale dick rider Adam Fisher lets
you know that Singh is an incredible, almost impossibly good
human being by describing him this way. He often wears
(52:57):
a T shirt with the words compassionate to the core
printed in diminutive all lowercase font over his heart.
Speaker 2 (53:05):
Do you think that this guy, this writer like just
it's at this point, he's on like a bucking bronco. Yeah,
SBF's dick, Like it's just absolutely is.
Speaker 3 (53:15):
He is fifty percent this guy's dick by weight. It
is unbelievable.
Speaker 2 (53:19):
What does he think is gonna happen for him?
Speaker 3 (53:21):
He's gotta get paid by Sequoia to write a ten
thousand word article that they then pull from their website
When it becomes clear this man is a massive financial criminal.
Speaker 2 (53:31):
He's like, no, my greatest work.
Speaker 3 (53:35):
It's extremely funny. No, look, I don't know Singh, but
based on the description that guy gives, I am convinced
he's murdered a child with his bare hands. And that
is my head canon for this man. No one else
would wear that shirt. So these these EA nerds all
form a trading firm called Alameda, and in doing so,
they came down on one side of probably the biggest
(53:57):
split within the crypto community. See, the core of the
idea that's not bad, that exists within cryptocurrency, is that
centralized, state-controlled money, like, has problems, right?
You know, there's things about that that are bad. And
it could be cool and useful to be able to
separate the money from the state if you could do that, Right,
(54:18):
if you could do that in a way that reduced
the state's power to like, you know, just lock down
the bank accounts of dissidents and stuff like that, there's
cool benefits to it.
Speaker 2 (54:27):
Well, didn't you, Well I just had an idea, what
if we did that, but then we gave all the
money to one guy.
Speaker 3 (54:33):
Well that's kind of what keeps happening. But also, like
SBF's on the other side of this argument, right, because
obviously most of the actual benefits of a truly decentralized
online currency are just you can buy drugs with it
over the internet. But still that is a real value.
People do in fact buy drugs using cryptocurrency, and that's fine.
(54:54):
And the committed ideological crypto people tend to keep their
money off in a wallet only they can access. Right,
So you basically you have like a hard drive that
has all of your crypto on it and that only
touches the internet when you plug that into your computer
and you use it to make a transaction, right, and
otherwise it's completely offline, and so people can't just take
(55:16):
it from you, right, right, that's the smart people. This
is a pain in the ass though, right, like keeping
it in this thing, like, there's all these security steps. You
can lose your password. People actually do lose their money
this way too, But anyway, it's there's a measure to
which it makes sense and is secure. But most people
don't want to go through that pain in the ass,
(55:37):
so they put all of their crypto currency in what
are effectively crypto banks, these exchanges, and these exchanges are
places like mount gox, which a few years ago all
of the money got stolen from, and FTX, which Sam
Bankman-Fried makes, which also all of the money gets
stolen from. Right. So the people who are like, no,
you shouldn't do what Sam is doing. You shouldn't make
an exchange because that's not decentralized. And we like this
(55:59):
because it's decentralized, they are the ones who get
robbed less often because they're a little smarter. And that's
part of the point. These exchanges, they're meant for people
who don't see crypto as, like, well, I want
to fight the state by removing my money from the
banking system. They're for people who are like, I want to
(56:19):
try to get rich quick by gambling, right, And that's
why those people are also the most vulnerable to scams.
Speaker 2 (56:25):
Robert, anytime you explain crypto with me to me, even
though I know I need to understand it for the
context of the episode, I just feel like I'm in
a corner at a house party and I'm holding a
clammy bud light and it's mostly empty, and you're like,
just one more thing though, because I.
Speaker 3 (56:41):
Mean, the important thing to understand is that what Sam
has done so crypto is like unregulated, right, it is
detached from any state, Okay, which is why people like it.
Which is why people like it. Yes. What
Sam has done is come in, and he's not the
first person to do this, and said,
I have built a place where you can keep all
of your crypto and you can trade it with other
(57:03):
crypto to try to make money the same way people
do with the stock market, right.
Speaker 2 (57:07):
Where it is more secure.
Speaker 3 (57:09):
No, because here's the thing, Jamie. He says, it's secure,
but here's the thing. So you know how the banking system,
how banks used to just go bust and everyone would
lose their money and it caused a great depression. You
get that part of the history of finance, right.
Speaker 2 (57:23):
Oh yeah, one of the characters from Titanic killed themselves over.
Speaker 3 (57:26):
That, exactly. Exactly, So that was a big problem, and
we developed a bunch of regulations. So that other thing,
I know it is, but Robert. So, what Sam has
done is he's built a bank that has none of
that so that people can gamble on the internet. Right, Okay,
so so great depression. Yeah, got it, Yeah, got it exactly.
That's all you need to understand is that Sam has
built a big, unregulated bank for people to gamble with. Yeah. Anyway,
(57:52):
and this is this is it is. It could not
have been clearer that this was a Ponzi scheme. In
twenty eighteen, they put out an advertisement to investors, and
I'm going to read it right now. I think you'll
be like, we offer one investment product, fifteen percent annualized
fixed rate loans. We can accept both fiat and crypto
and can pay interest denominated in either. These loans have
(58:13):
no downside. We guarantee full payment of the principal and interest,
enforceable under US law and established by all parties legal counsel.
We are extremely confident we will pay this amount in
the unlikely case where we lose more than two percent. Anyway.
Again, I'm not a finance expert, and neither are you, Jamie, but banks
offer like a three percent return on, like, a fucking,
(58:35):
if you're putting a pile of cash in a bank
and you're getting three percent back, you're doing okay. Fifteen
percent is nonsense that nobody, no one can guarantee. It's
not a real product.
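For a sense of why a guaranteed fifteen percent is a red flag, compare compounding at a bank-ish three percent against the promised rate (round numbers, illustration only):

```python
# Compound a deposit at a realistic bank-ish rate versus the 15% the
# Alameda pitch "guaranteed". Round numbers, illustration only.
def compound(principal: float, annual_rate: float, years: int) -> float:
    return principal * (1 + annual_rate) ** years

deposit = 10_000
print(round(compound(deposit, 0.03, 10)))  # 13439 -- plausible bank-ish growth
print(round(compound(deposit, 0.15, 10)))  # 40456 -- roughly 4x, with "no downside"
```

No legitimate product guarantees that second curve with zero risk, which is the tell.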
Speaker 2 (58:46):
I was pretty struck by their use of the, like,
we're pretty confident we're gonna be able to
pay. It sounds like, it sounds like someone
who is not actually very confident.
Speaker 3 (58:56):
And also if this is if you are, if you
are investing in a in the stock market, right and
someone says this investment has no downside, it's impossible for
it to fail. That person's lying to you and breaking
the law because it could and often does.
Speaker 2 (59:11):
Oh right, because then you're making an
almost guaranteed false promise.
Speaker 3 (59:15):
Yes, yes. You cannot say this stock
can't go down. Right, If you're a stockbroker, you cannot
tell a client it is impossible for your investment in
this company to fail, because that would be a crime.
But with crypto.
Speaker 2 (59:29):
With crypto, it's uncharted territory.
Speaker 3 (59:32):
You can do anything. This is also a Ponzi scheme, right.
What fucking Madoff was doing is
he had this investment portfolio that was, I forget
the exact numbers, but it was promising an unbelievably high return
and guaranteeing that people would get it right. And what
he was doing is as new people put money into
the investment, he was using their money to pay the
(59:52):
old investors so that nobody noticed that things were fucking up.
But eventually new people stopped putting money into the thing
and it all fell apart. A lot of people lost
billions of dollars, right. He was
also using a lot of that money to live, you know,
an incredibly lavish rich-guy life. Anyway.
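The Madoff mechanics Robert just described, new deposits quietly covering the "returns" paid to earlier investors until inflows dry up, can be sketched in a few lines of Python (all figures illustrative, not Madoff's or FTX's actual numbers):

```python
# Toy Ponzi scheme: the operator earns nothing, so the "interest" paid to
# existing investors comes straight out of new deposits.
def run_ponzi(monthly_deposits, promised_rate):
    """Track the operator's cash pile month by month.
    Returns the month index when payouts exceed cash (collapse),
    or -1 if the scheme survives the whole period."""
    cash = 0.0
    invested = 0.0  # total principal owed to investors
    for month, new_money in enumerate(monthly_deposits):
        cash += new_money
        invested += new_money
        owed = invested * promised_rate  # monthly "return" paid out
        if owed > cash:
            return month  # can't cover the payouts: collapse
        cash -= owed
    return -1

# Growing inflows keep it alive; the moment new money stops, it collapses.
growing = [100.0 * 1.5 ** m for m in range(12)]
stalled = growing[:6] + [0.0] * 12
print(run_ponzi(growing, 0.10), run_ponzi(stalled, 0.10))
```

The first run survives because deposits keep growing; the second pays out of a shrinking pile once new money stops and collapses a few months later, which is the "eventually new people stopped putting money in" step.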
Speaker 2 (01:00:10):
Sorry, I only understand financial concepts when Selena Gomez breaks
the fourth wall and explains it to me.
Speaker 3 (01:00:15):
I'm doing my best to be your Selena Gomez, But.
Speaker 1 (01:00:19):
Do you know who Selena Gomez is.
Speaker 3 (01:00:22):
Yeah, she's that chick from the thing Nailed It.
Speaker 2 (01:00:25):
She's she's kind of in the middle of a fun
scandal right now where she's in a feud with someone
who donated her a kidney, which is a very funny
online feud.
Speaker 3 (01:00:36):
How do you get in a feud with that person?
Speaker 2 (01:00:39):
Because she, okay, Robert, this is something
I can explain to you. So what happened
was Selena Gomez needed a kidney donated. Her close friend,
who works in the industry, and I don't know who it was,
but I guess she's like sort of famous, gave her
a kidney a couple of years ago. Then Selena Gomez
turns around a couple weeks ago and says, I have
(01:01:01):
no friends in the industry except Taylor Swift. In comes her
kidney donor, being like, oh, that's interesting, because I'm one
kidney lighter, dude. Like, I'm not quoting it, but yeah.
And then Selena, instead of apologizing, Selena Gomez
is like, oh, sorry, I didn't thank every person I've
ever met in my life.
Speaker 3 (01:01:19):
Yeah, but I mean, Selena, she did give you a kidney.
Speaker 2 (01:01:22):
Yeah, that's a special kind of friend. You guys, weren't
just like drinking buddies. She literally gave you a kidney,
and that she did.
Speaker 3 (01:01:32):
She did the thing that you would use as like
a joking description of someone who'd done a lot for you,
to like, like to hyperbolically say that you owed them
a lot, you know.
Speaker 2 (01:01:42):
Who's done... and implying that Taylor Swift has done more
for you than your kidney donor is wild. So I just
think it's the funniest feud of all time. All right,
we were talking about a cryptocurrency.
Speaker 3 (01:01:53):
We weren't. We were talking about let's get back to cryptocurrency, Okay,
So to talk about what came next after establishing Alameda,
I'm going to quote again from that Sequoia write up.
At this point mid twenty nineteen, SBF decided to double
down again and scratch his own itch. He would bet
(01:02:14):
Alameda's multimillion dollar trading profits on a new venture, a
trading exchange called FTX. It would combine Coinbase's stolid,
regulation-loving approach with the kinds of derivatives being
offered by Binance and others. He only gave himself a
twenty percent chance of success, but in his mind, SBF
needed extreme risk to maximize the expected value of his
lifetime earnings, and therefore the good his earn-to-give
(01:02:36):
strategy could do. The fact that he was, by his
own lights, overwhelmingly likely to fail was beside the point.
The point was this, when SBF multiplied out billions of
dollars a year a successful cryptocurrency exchange could throw off
by his self assessed twenty percent chance of successfully building one,
the number was still huge. That's the expected value. And
if you live your life according to the same principles
by which you'd trade an asset, there's only one way
(01:02:58):
forward: you calculate the expected value, then aim for
the largest one, because in one, but just one, alternate
future universe, everything works out fabulously. To maximize your expected value,
you must aim for it and then march blindly forth,
acting as if the fabulously lucky SBF of the future can
reach into the other parallel universes and compensate the
failson SBFs for their losses. It sounds crazy, or perhaps
(01:03:19):
even selfish, but it's not. It's math. It follows the
principle of risk neutrality. Yes, it actually is crazy. That's
not math. I'm sorry, that's actually not math.
Speaker 2 (01:03:30):
That was a lot of words all at once. That
is like, oh my god, the spiraling logic is this.
Speaker 3 (01:03:38):
You are using hundreds of words and high minded bullshit
rhetoric to be like gambling is the best way to
make money.
Speaker 2 (01:03:46):
Used fabulously three times in one sentence.
Speaker 3 (01:03:50):
Yeah, man. You know who I've heard this basic argument
from? My friends, drunk in Las Vegas, explaining what they're trying
to play at the craps table.
Speaker 2 (01:03:58):
Why they don't want me to leave the little horsey game.
This is something. Yeah, someone's like, no, don't leave the
sex in the city slot machine.
Speaker 3 (01:04:05):
And you want to hear my mathematical thinking on gambling,
Jamie loftus, because this will make more sense than anything
that happens in the Sequoia article. Okay, when I go
to Las Vegas, I find me the penny slots where
I can see the most waitresses walking around with those
little trays that they have the drinks and stuff on.
Then I sit down at them, and I don't start
to play until one gets close to me, and then
(01:04:26):
I press the button as soon as she walks past,
and I, like, catch her attention, and then I get
a free drink, And the way that it works out
is that as long as I can get more free
drinks than I'm spending at the penny slots, and mostly
I'm just reading a book and hiding it while I'm
at the penny slots, and I only press it when
the waitress gets near.
Speaker 2 (01:04:44):
Please, then like the other Vegas guys.
Speaker 3 (01:04:46):
I can drink effectively for free. Right, it works out
to be like twenty five cents a drink if you're
really smart about it. That is my financial advice to
all of you. I do respect that. How you make
it even better: I used to go with a bag,
because when I was poor, what I would do is
I would go to Vegas once every couple of years,
usually for work, and I would get all the free drinks,
(01:05:07):
which came in glasses a lot of the time, and
I would keep the glasses, and so I was able
to furnish my apartment with stolen Las Vegas glasses.
Speaker 2 (01:05:14):
Love stealing glasses. I've stolen my fair share of glasses
in my time.
Speaker 3 (01:05:18):
I would get up to the floors where there were
the nicer hotel rooms, and I would find all the
people who'd set out their plates and stuff from like
room service, and I would just take those and take
them back to my house.
Speaker 2 (01:05:29):
I respect that I'm trying to think. I was like,
I don't have a real system for well. Actually, when
I go to Vegas, I always stay at the hotel
with a roller coaster on top. And here's what I do.
I go on the roller coaster once, sometimes twice. Then
I go see one show. Usually it's horrible. The last
time I went to see the Backstreet Boys and guess what,
I found out one of the Backstreet Boys is in QAnon.
(01:05:51):
And then I got bummed out by this. It was real.
Speaker 3 (01:05:56):
The point I want to make, Jamie, is that what
I just described to you, my Vegas strategy, has made
me infinitely more money in net profit than Sam
Bankman-Fried is actually going to make in cryptocurrency.
Speaker 2 (01:06:09):
Oh, fun foreshadowing. Yes. So he'd never hang out near
waitresses, though, because he's probably afraid of women.
Speaker 3 (01:06:18):
He's probably afraid of them. But I am fine with
asking women for a free drink as long as I'm
paying at the penny slots, you know, so it's not weird.
Speaker 1 (01:06:25):
Wow.
Speaker 2 (01:06:26):
Feminist icon Robert Kon.
Speaker 3 (01:06:28):
Robert Evans getting those, getting those shitty Vegas Irish coffees
because they come in the glasses I want.
Speaker 2 (01:06:34):
Those are nasty, but I like the glasses they used
to come in. They do, I know, but then
there's the consequence of having to drink what's inside.
Speaker 3 (01:06:42):
Well, you know, Jamie, that's why they call me a hero.
They call me the Mahatma Gandhi of the West. They call me
the Jesus Christ of podcasting. These are all things people don't know.
Speaker 2 (01:06:52):
If you're gonna lie, make it realistic, James.
Speaker 3 (01:06:56):
So anyway, let's continue. I'm like, yeah. I think it's
hard to justify being risk averse on your own personal
impact, SBF told me when I quizzed him
about it, unless you're doing it for personal reasons. In
other words, it's selfish not to go for broke if
you're planning on giving it all away in the end. Anyway, again,
(01:07:17):
just clownshit. So all of this is a con spoiler,
so you don't have to think that much about it.
In a recent series of text messages with a Vox journalist,
after his entire exchange exploded and everyone found out he
was a financial criminal, Sam Bankman-Fried more or less
admitted that everything he'd had to say about effective altruism
was a con meant to get people to trust him
(01:07:39):
and invest in his company. And I'm going to read
the texts. Yeah, I'm going to read the texts to
you between him and this journalist who, by the way,
he put money into Vox, so he helped fund
this journalist.
Speaker 2 (01:07:49):
Wow.
Speaker 3 (01:07:50):
So the ethics stuff, this is the journalist: so the
ethics stuff, mostly a front? People will like you if you
win and hate you if you lose, and that's how
it all really works. Sam, Yeah, I mean that's not
all of it, but it's a lot. The worst quadrant
is sketchy and lose. The best is win plus question
mark question mark question mark. Clean plus lose is bad
but not terrible. He also misspells terrible, but whatever. The
(01:08:13):
journalist then replies, you were really good at talking about
ethics for someone who kind of saw it all as
a game with winners and losers. Yeah, he he. I
had to be. It's what reputations are made of.
To some extent, I feel bad for those who get
fucked by it. But this dumb game we woke Westerners play,
where we all say the right shibboleths, and so everyone
likes us and that's the actual truth here, right, That's
(01:08:35):
the thing that's honest about it. It's like, hey man,
it was all bullshit, all of this talk, for everyone. That's
the entire thing with this motherfucker. The only reason his effective
altruism thing exists as a funded thing is as a
fucking shibboleth for billionaires who don't want to pay
taxes and want to let the world crumble around them,
will sucking as much value out of the working class
as they can, and want to pretend like they're heroes
(01:08:56):
at the same time, so that people ride their dicks
in articles like that fucking Sequoia piece.
Speaker 2 (01:09:00):
That absolutely fucking ridiculous text actually is like a very
important document.
Speaker 3 (01:09:06):
It's incredibly important. It's deeply crucial.
Speaker 2 (01:09:09):
I hate that there is such a crucial document that
also includes he he in it.
Speaker 3 (01:09:13):
Yeah, there's nothing to be done about that one, Jamie.
Speaker 2 (01:09:16):
Look, it doesn't feel good, but you know, our previous
most important document to that effect was an IM that said, haha.
So a text with he he is kind of the
logical progression. That is fascinating. Do you have any insight
into, like, why he would so freely admit that now?
Speaker 3 (01:09:32):
I don't, actually. One of two things has to be happening,
because again, spoilers, this all falls apart. His exchange
goes from worth thirty two billion dollars to worth basically
zero dollars in the space literally in my head.
Speaker 2 (01:09:43):
Anytime you say.
Speaker 3 (01:09:44):
Like twenty-four hours this happens. His net worth falls
ninety-four percent in a day. Like, it
all collapses because they realize that all of the
money is gone, that he'd been taking money from one
business and using it to gamble in another, and also to
pay himself and his friends. And all of the money
that the investors had put in when they tried to
withdraw it, like their money that was supposed to be
(01:10:06):
in there on paper, none of the money existed because
again he'd stolen it. Anyway, the context of this article
makes it clear that he felt like, I don't know whatever.
This was all a confidence game, right? That's the key to
all of this. It could work because
people were looking at their balance sheet: I'm making money,
I'm making money, the returns are great, and that money
(01:10:26):
existed on paper until they tried to take it out,
because then it actually wasn't there because he had already
frittered it away. It's a confidence game, and we have
an you know that's the way it actually worked. We
also know and this is all still coming out, so
I'm not going to get too much into it, but
we know that he had FTX loan himself, Sam Bankman-Fried,
about a billion dollars. Like, his paper value was like
(01:10:49):
twenty two billion, but he gave himself basically a billion
dollars in other people's money, although he may have gambled
that away. It's really unclear how much money he actually
has liquid at the moment.
Speaker 2 (01:11:01):
Do we think he has any.
Speaker 3 (01:11:03):
I have no idea either. Either he was like actually
a gambling addict and a narcissist and he really did
lose it all or this was a con from the beginning,
knowing it would all collapse, and he got as much
as he could out of it, and he's going to
wind up someplace without extradition, right like, and that was
the goal.
Speaker 2 (01:11:18):
If you ask me a couple of years ago, I
would have said it's the latter. But I feel like
the last couple of years have demonstrated so often that
like people are just straight up not smart and don't
have a plan, and it is all a narcissist shell game.
Speaker 3 (01:11:29):
It's very, it's unclear at the moment. I'm going
to read you some things at the end here and
you can kind of make up your own mind. Anyway, whatever. So after
you know, starting FTX, the company moves to Hong Kong
and then the Bahamas, and they buy this
very, like, thirty-nine million dollar mansion that he
lives in with his friends using FTX tokens, which is
like internal cash that his company issues based on the
(01:11:51):
perceived value of the company.
Speaker 2 (01:11:53):
I don't know, how many rooms is that?
Speaker 3 (01:11:56):
It's, hell, a shitload, and they're able to buy it
because everything's, like, paper. On paper, they've gone from
nothing to worth thirty-two billion dollars in like a
year or two, and these idiot, like, property owners
in the Bahamas are like, well, clearly, the best thing
we could do is let them buy this building using the fake
money they created for their own company, that they tell
us is worth a lot of money. This is worth it.
(01:12:21):
So since everything collapsed, some people that SBF had like
reached out to as early investors have commented about why
they didn't invest in this company in the early days,
and most of them it's because, like what they could
see of his investments, the tens of millions that he
promised to charities, and the long positions in risky cryptocurrency
companies, didn't make sense. They would look at, like, the
(01:12:41):
things he was buying with the company assets and the
things that like he was investing in and be like, well,
there's no way he could have that kind of liquidity.
If this is a legitimate exchange, right, he can't have
that kind of cash on hand. It doesn't make any sense.
Speaker 2 (01:12:54):
Rich fucking assholes always telling on themselves like that. Yeah, that's funny.
Speaker 3 (01:12:58):
And what's funny is that, like I found one of
these guys who like, yeah, I didn't invest because I
could tell it was a con and then was like,
but I didn't tell anybody because I didn't want to
get yelled at.
Speaker 2 (01:13:11):
That's unfortunate. I do see myself in a statement where I
was like, oh, someone might, someone might send me
a rude text.
Speaker 3 (01:13:22):
I guess I will this.
Speaker 2 (01:13:24):
Yeah, oh my god, it's very goofy bullshit. I don't know.
Speaker 3 (01:13:29):
I'll tell you one thing, Jamie. He's a spineless guy
who still has his fucking money.
Speaker 2 (01:13:35):
Wow, you're in love with him? That's what? Wow?
Speaker 3 (01:13:39):
Yeah, you know, whatever. It's the finance industry. They're all ghouls.
So, from what we can actually tell now, because again
this fucking Captain Dick Rider, the bad writer, like, I
have to read you another quote that I didn't have
in my script to give you an idea of just
how much he fucking how much he loves SBF. Here's
(01:14:03):
a quote from him. Okay, yeah, this is this is
actually Jamie. Oh boy, this is when, this is when,
this is when, this is when the writer Captain Dick
Rider is hanging out with him in the Bahamas at
his office.
Speaker 2 (01:14:20):
Quote he's like, what have we kissed?
Speaker 3 (01:14:23):
Sensing an opportunity for connection, I chip in with my
own two satoshi, which is two cents, but in bitcoin terms.
Speaker 2 (01:14:30):
Anyway, I'm going to walk into the ocean. Wow. Okay.
Speaker 3 (01:14:34):
I don't pay any attention to social media, not because
I have any moral case against it, I say, but
because for me, reading books is the highest bandwidth way
I know to get quality information into my brain, which
just craves the stimulation I'm addicted to reading, which explains
how I ended up being a writer. Oh yeah,
says SBF, I would never read a book. I'm
not sure what to say. I hate them
(01:14:57):
both. Because I'm not sure what to say, I've read
a book a week for my entire adult life, and I
have written three of my own. I'm very skeptical of books.
I don't want to say no book is ever worth reading,
but I actually do believe something pretty close to that
explains SBF. I think if you wrote a book, you
fucked up, and it should have been a six paragraph
blog post.
Speaker 2 (01:15:16):
Meanwhile, the guy's writing a ten-thousand-word article about a
guy who will not read it. It's like two guys
whose heads are made out of rocks, just on like
clonking against each other repeatedly.
Speaker 3 (01:15:29):
Just wait, Jamie, Jamie. It hasn't gotten as
bad as it's going to get. Oh no. Whatever the case,
I hate books. Whatever the case, I find myself sad
for the man, and it occurs to me that my
reaction is exactly what might be expected from a beta
in the brave new world crypto is creating. Whoa, how
(01:15:50):
can you write that and not leap off the top
of a building?
Speaker 2 (01:15:54):
He literally self-identified as a beta, too.
Speaker 3 (01:15:57):
Also like What I love about this is just like, well,
we have to take it. We have to take it
as given that Crypto is creating a brave new world
and none of us has any choice in that. It's inevitable,
it's unstoppable, it's going to dominate everything, which is.
Speaker 2 (01:16:09):
Like capitulate to our bb boo boob overlords.
Speaker 3 (01:16:12):
Yeah, so, I think, again: What, I wonder, does he think
I think? Wouldn't someone with IQ points to spare realize
that dismissing books, all books, as essentially worthless might rile
a writer? Was he playing with me? Is this fun?
Is this humor? I'm satisfied with my meta analysis until
I realized that one can always increment the level of
strategic play in this sort of game. It's like poker.
(01:16:33):
Level one is just thinking about how to strengthen your
own hand. Level two is thinking about what your opponent's
hand is. Level three is thinking about what your opponent
thinks your hand is, and so on. And since SBF
is obviously a genius, I should simply assume that compared
with me, SBF will always be playing at level N
plus one, which makes my analysis of the
intent behind SBF's books-are-for-losers idea spiral into infinity
(01:16:54):
and crash like a computer.
Speaker 2 (01:16:57):
Is this how you write about me in your diary?
Speaker 3 (01:17:00):
No, Jamie, because you know, do you know what the
only ethical speaking of ethics, you know what, if you
think this way about conversations, the ethical thing to do
is fill your pockets with rocks and walk into the ocean.
Speaker 2 (01:17:13):
Wow, you're Woolfing this guy.
Speaker 3 (01:17:15):
Yeah, I absolutely am. Him and Sam Bankman-Fried, I
could not hate these people more.
Speaker 2 (01:17:22):
I mean, they're both there. There's never been two wronger
people having a conversation.
Speaker 3 (01:17:28):
And of course it came out. He's just like, well,
obviously he's a genius. We have to assume that because
he became a billionaire. And he's like, no, he was
never a billionaire.
Speaker 2 (01:17:35):
I mean, in his defense, that's math.
Speaker 3 (01:17:37):
He did think that. That's a balance sheet. Like, we've now
gotten access because he had to, like, go through the
bankruptcy and step down. So like now there's a caretaker
trying to get people's money out of the company, and
we know shit, like we've seen the Excel files where
they kept their financial records, and it's him being like
this is basically bullshit, Like sorry about this, we fucked
this up. We weren't actually keeping records here the company
(01:17:58):
balance sheet. There was no accounting department. When they
would spend tens and hundreds
of millions of dollars on things, they would
just message each other on Signal about it to get
approval, and had disappearing messages on. So there's actually like
no record of a lot of the accounting. They did
at one point hire an outside accounting firm to handle
(01:18:19):
their accounting of this thirty two billion dollar company, and
the firm they hired bills itself
as the first and only metaverse-based accounting firm.
Speaker 2 (01:18:28):
Oh god.
Speaker 3 (01:18:30):
The accounting firm for this thirty two billion dollar company
exists entirely within a crypto-themed video game called Decentraland.
Speaker 2 (01:18:39):
It is so, fuck, you're stupid with everything you're saying. Okay,
so this is really, this is bad, and this feels
bad to hear. I have a question. Yes, Jamie. Who
is, I guess, because I am, I am kind of
back in Lizzie Holmes Land, because that whole last,
I mean, like, departments that should be, you know, taking
(01:19:02):
care of shit don't even exist, which is very Theranos
adjacent to me. Yeah, Theranosian, if you will. As we've
both written books, we can just make up words. Yeah.
Speaker 3 (01:19:14):
Course, Yeah, that's that's mostly what writing a book is.
For sure.
Speaker 2 (01:19:18):
I view myself as the beta of this conversation, and
so I feel comfortable asking you, Robert, who is ultimately
affected by this? Like what is the like trickle down
of this? What happens? Is it just other rich assholes
or does it affect regular Like does it affect.
Speaker 3 (01:19:41):
Probably. There will presumably be some, here's the thing, and
here is why there's that lawsuit against, like, Larry David
and all those other guys who appeared in the FTX ads. Yeah. Yeah,
presumably a bunch of regular people got suckered into putting
their money on FTX, and those people probably lost
some money. That said, for the most part, it's fine
(01:20:02):
because most of the people who lost money are like
gamblers who probably suck as much as this guy did.
And it's one of those things. There's just an article
I think at Financial Times where someone's like, actually, and
I think they actually had a good point. We shouldn't
regulate the crypto industry because if we regulated, it will
be brought in closer to the actual financial industry as
(01:20:22):
it exists, and banks will put more investments into crypto
and it will get seen as like legitimate and backed
by the state. And so when these conmen destroy tens
of billions of dollars overnight and cause panic in the industry,
it will affect the real economy. And right now it
doesn't seem to And like, yeah, I guess that is k.
That's not a bad point. Maybe we just let it
die on its own.
Speaker 2 (01:20:41):
I don't know, and it seems like, sort of,
we'll know more.
Speaker 3 (01:20:46):
Yeah, so, yeah, we will continue to learn more. One
of the things that's funniest about this is that Sequoia,
this investment firm, put like two hundred million dollars into
the company, which they have all written off. Now they're
accepting it as a total loss.
Speaker 2 (01:21:00):
Going full back girl on this.
Speaker 3 (01:21:01):
Okay, Now, when you hear this very serious investment firm
put two hundred million dollars into this business, you probably
assume Jamie Well, I bet he had a good pitch,
right I Actually.
Speaker 2 (01:21:12):
I don't know. If we're talking Theranos, I actually don't
think that that is, that's not disqualifying, to have
a dogshit pitch.
Speaker 3 (01:21:18):
You know, you know who can make this clear for
us? Captain Dick Rider. Oh, thanks. SBF told Sequoia
about the so-called super app. I want FTX to
be a place where you can do anything you want
with your next dollar. You can buy bitcoin, you can
send money in whatever currency to any friend anywhere in
the world. You can buy a banana. You can do
anything you want with your money from inside FTX.
(01:21:39):
Suddenly the chat.
Speaker 2 (01:21:44):
Banana cod I don't know.
Speaker 3 (01:21:45):
I feel like I can do anything I want with
my debit card. Like I've never run into a thing
I wanted to buy and been like, ah, I cannot.
Speaker 2 (01:21:55):
No, how do I act? I can even buy.
Speaker 3 (01:21:57):
Drugs with it by going to an ATM. If I
were to be a person who buys drugs, which I'm not,
I could go to an ATM and take out cash
and purchase. Support your local drug.
Speaker 2 (01:22:07):
Dealer and banana vendor.
Speaker 3 (01:22:10):
For crying out loud. So he gives this banana
pitch. Quote: Suddenly the chat window on Sequoia's side of
the Zoom lights up with partners freaking out. I love this
founder, typed one partner. I am a ten out of
ten, pinged another. Yes, exclamation point, exclamation point, exclamation point,
exclaimed a third. What Sequoia was reacting to was the
(01:22:31):
scale of SBF's vision. It wasn't a story about how
we might use fintech in the future, or crypto,
or a new kind of bank. It was a vision
about the future of money itself, with a total addressable
market of every person on the entire planet. I sit
ten feet from him, and I know it's these people
are just.
Speaker 2 (01:22:50):
Shows three executives doing lines of coke and one swallowing
a banana with the peel still on. They're like, yes, yep,
oh my god, Okay, yeah, so it's I mean, which
does kind of continue with the trend of like, you know,
as, as SBF has never had a problem or a
(01:23:10):
like anything to overcome, Like if it's this easy for
him to con people into shit, like, of course you
would have a god complex. You've never you've never been
told no.
Speaker 3 (01:23:21):
Yep, yep, yep, yep, yep, it's cool. So what Sequoia
was, yeah, sorry, I have to continue this, this
fucking quote, and next he's going to talk about a
person who works at Sequoia and is in the room
for this. I sit ten feet from him, and I
walked over thinking, oh shit, that was really good, remembers Arora.
And it turns out that fucker was playing League of
Legends through the entire meeting. And this is framed in
(01:23:44):
the article and in all the coverage before everything fell apart,
is like, so awesome, He's so cool. This writer talks
a bunch about how like Sam never stops playing video
games when he's talking to this writer, when he's having
corporate meetings, he's playing video games basically one hundred percent
of the time. And this is always mentioned, is like
he's always working. He's always in the office. He sleeps
in a beanbag chair at his desk, and it's like, no, dude,
(01:24:06):
he's not always working. He's conning you and he plays
video games all the time and pretends that that's a
fucking job, which is great, great con good for.
Speaker 2 (01:24:16):
You, buddy, always working in always there two very different
flavors of things happening exactly.
Speaker 3 (01:24:22):
Yeah, it's all part of the fucking con. So is
the fact that he always he always wore like ratty
old athletic shorts and like a wrinkled T shirt because like,
that's for if you are a young man in the
tech industry. That makes people think you're a genius, right,
because genius is dressed like shit.
Speaker 2 (01:24:39):
Yeah. He's like, now if I bring a real stink
into the room, people are gonna, like, jack up my
already fake IQ score by about twenty points. Yeah, this is
fucking horseshit.
Speaker 3 (01:24:49):
And it's like, you know who else dresses like shit?
The guy I used to buy weed from, Oh yeah
yeah yeah. Also, you know who wears the same outfit
as Sam Bankman-Fried, my old buddy who once at
a party got into an argument with a guy and
broke fifteen bones in his face because they were both drinking.
(01:25:12):
Not a corporate genius. I do like that.
Speaker 2 (01:25:15):
Shared aesthetic where it's like, really, the difference is the
pet snake. That is, yeah, that's how
you truly can sniff out, yeah, the millionaire.
Speaker 3 (01:25:29):
He's basically he's the same kind of person as Elizabeth
Holmes and he was again he's a confidence man. And
this gets us into the Larry David shit because the
thing about being a confidence man is that as long
as people are convinced their money is safe,
and most of them don't try to pull it out.
Then you can keep the con going and you can
keep the fake numbers increasing, and everyone will think
you're richer, and you can actually get real money out
you're richer and you can actually get real money out
of this. So one of the things that he did
is he would pour shitloads of money into sponsorship deals
and to other ventures to make his companies seem legit.
One way he did this, yeah: he spent
seventeen and a half million through FTX to sponsor the
athletic teams at UC Berkeley. He launched a twenty million
dollar ad campaign with Tom Brady and Gisele Bündchen. He
(01:26:12):
offered NFTs at Coachella, uh huh. And he spent one
hundred and thirty five million dollars on the naming rights
for the Miami Heat's home arena. And this is all
to build confidence, right? You see, it's the fuck, it's
the same thing Crypto dot com did, by the way,
with the arena. And what do you say?
Speaker 2 (01:26:27):
I was like, I feel like the Crypto dot com
arena is not long for this world, and you so.
Speaker 1 (01:26:32):
We don't should I don't recognize that as an act?
Speaker 3 (01:26:35):
Is my money safe? In this thing that's a bank,
but not a bank. Well, their name is on the arena,
so it's probably legit.
Speaker 2 (01:26:43):
Here's the thing is like, Yeah, it's like now walking
into the Crypto dot com Arena, I couldn't feel
less safe. I couldn't feel less secure. Like when
it was the Staples Center. I'm like, oh, you know,
it's never going to go out of style. Little books.
Speaker 3 (01:26:57):
People have been using Staples since we were cavemen.
Speaker 1 (01:27:00):
And I assume so, yes. You, technology, just like the
worst fucking name on Earth.
Speaker 3 (01:27:05):
It's it infuriates me these people.
Speaker 2 (01:27:08):
Yes, Sophia has a dog in the fight I do.
Speaker 3 (01:27:11):
In his interview with Vox, Sam basically admits this, albeit
in a slightly careful way. The journalist says, so FTX technically wasn't
gambling with their money. FTX had just loaned their money
to Alameda, who had gambled with their money
and lost it. And you didn't realize it was a
big deal because you didn't realize how much money it was.
Sam responds, and also, I thought Alameda had enough collateral
(01:27:32):
to reasonably cover it. The journalist says, I get how you
could have gotten away with it, but I guess that
seems sketchy even if you get away with it. Sam:
it was never the intention. Sometimes life creeps up on you.
Speaker 2 (01:27:45):
As they say, life comes at you fast.
Speaker 3 (01:27:47):
Life comes fast. So Sam's net worth topped out at around
twenty two billion dollars on paper. In reality, neither Alameda
nor FTX had ever taken in even close to that
much money. The valuation was based entirely on nonsense
calculations that were themselves based on lies from FTX's extremely
cooked books. There's a lot more about this than we're
getting into. People are still finding this all out. There
(01:28:09):
is one thing I should probably read, which is, so
you know, when his company collapsed, he had to step
down from running it, right, and because a lot of
money is still in there and a lot of, like,
investments are still tied up in that, they put a
guy in charge of the company again, right. And there's
like there's specific dudes in the business world whose like
(01:28:29):
job is to come in when a company fucks up
like this and like try and get as much money
back out for the shareholders as possible to minimize the bleeding.
Speaker 2 (01:28:39):
So they kind of have, like how they
had in old Hollywood. They have, like, a fixer guy.
Speaker 3 (01:28:45):
Yeah, yeah, yeah, they have a fixer. And this is,
this fixer guy specifically, he is the guy that they
brought in when Enron... He took over Enron after it
fell apart in order to try and, like, minimize the
damage from it. So he is the guy who got
brought in to deal with, like, the fact that this
massive fucking crime happened with Enron. One sec.
Speaker 2 (01:29:05):
My favorite, my favorite Enron memory, not that you asked, Rob,
was the Women of Enron Playboy spread that I got
to archive during my time there. Oh god, boy, did
those Enron girl bosses have their day. Second
only to Women of 7-Eleven, which is my
(01:29:26):
actual favorite spread. Continue.
Speaker 3 (01:29:28):
First off, you'll note this company has about a million creditors,
so about a million people possibly lost their entire investment
in this company, which is a stunning amount of people
to take money from. And again, we are
probably looking at like five or ten billion dollars stolen,
something like that. It's kind of unclear the exact amount.
But anyway, this is what the guy wrote in the Delaware
bankruptcy court filing. This is what the guy who was
(01:29:51):
the guy who took over Enron after it became
clear that the whole company was a criminal enterprise. This
is what that guy wrote about Sam's company. Okay:
Never in my career have I seen such a complete
failure of corporate controls and such a complete absence of
trustworthy financial information as occurred here. From compromised systems integrity
and faulty regulatory oversight abroad, to the concentration of control
(01:30:15):
in the hands of a very small group of inexperienced, unsophisticated,
and potentially compromised individuals, this situation is unprecedented. Again, that's
the guy who took over Enron.
Speaker 2 (01:30:26):
I was like, he's literally like Bill Clinton dropping a
Notes app post being like, never have I seen a man
cheat on his wife like this.
Speaker 3 (01:30:35):
Like, so absurd. Well, I will say it's different. This
guy did not commit any of the Enron crimes, right?
This is the.
Speaker 2 (01:30:42):
Guy who saw them all.
Speaker 3 (01:30:45):
This guy is after he's brought in it becomes clear
that the.
Speaker 2 (01:30:50):
Last guy. Starting sentences with "Never in my career," this guy.
Speaker 3 (01:30:53):
Yeah, it's probably best to look at this guy as
like an EMT. When it becomes clear that a
company that has a shitload of money in it and
is central in the economy has collapsed because people broke
the law. He comes in to minimize the damage. But
he was not working at Enron previously, right, Like, it's
not he's not trying to like stop anyone from getting
in trouble. He's trying to minimize how many people are
(01:31:15):
hurt by this. Yeah, anyway, well of course, yeah, I
just want to make clear, like that guy's anyway whatever.
Speaker 2 (01:31:21):
Yeah, he's not the Enron guy.
Speaker 3 (01:31:22):
He is not the Enron guy. He did not make
Enron bad. He just was there afterwards and was like,
this company's even worse.
Speaker 2 (01:31:29):
He bought the issue of Playboy and then he kept
it push it.
Speaker 3 (01:31:32):
Yeah, so I should, God, there's, yeah, we're getting
close to done. I should note that before everything collapsed,
Sam, again, he's in the effective altruism movement,
he promised to donate like a couple hundred million
dollars to these EA causes. A lot less than that
actually wound up going out. Some of them were good, but
a lot of it was like so he made a
huge point that he was like his one of his
(01:31:53):
major priorities was pandemic prevention. Right, we have to stop
the next pandemic. I'm going to put as much money
as I can to pandemic prevention. That's the best effective
altruist thing that I can do. To talk about how
well that actually worked, I want to quote from the
Washington Post here. FTX backed projects ranged from twelve
million dollars to champion a California ballot initiative to strengthen
(01:32:14):
public health programs and detect emerging virus threats (amid lackluster support,
the measure was punted to twenty twenty four), to investing
more than eleven million on the unsuccessful congressional primary campaign
of an Oregon biosecurity expert, and even a one hundred and
fifty thousand dollar grant to help Moncef Slaoui, the scientific
advisor to the Trump administration's Operation Warp Speed vaccine accelerator,
(01:32:35):
write his memoir. So that sounds like a giant waste
of money, right? None of it ran,
even if it was, like, good. It sounds like it didn't work.
Like, even if the goals were good, well, the
ballot measure failed, or got punted, so it's
not like it worked. And it gets worse, because SBF's
fund also put a lot of money, like five million
dollars, into ProPublica. Now, ProPublica, they've done a
(01:32:56):
lot of cool stuff. They also published an extremely flawed investigation
that backed the lab leak hypothesis. I'm going to quote the
LA Times and their analysis of this deeply flawed piece
of reporting.
Speaker 2 (01:33:10):
And also, you know, we've got some notes for the
La Times as well.
Speaker 3 (01:33:13):
Yes, nobody's perfect. The LA Times called it a train wreck,
noting the article is based heavily on Chinese language documents
that appear to have been mistranslated and misinterpreted, according to
Chinese language experts who have piled on via social media
since its publication. It also takes as gospel a report
by a rump group of Republican congressional staff members asserting
that the pandemic was more likely than not the result
(01:33:35):
of a research related incident. And the
fact that ProPublica published this has, like, provided a
shitload of fuel to the "it was all a fucking
lab leak from China, it's China's fault" Republicans. Yeah, Sam
Bankman-Fried funded that shit, and my god, yeah, basically
none of the shit he was putting money into
(01:33:56):
that was supposed to be good really worked. And a
lot of it was... Another thing the right is doing
right now is they're talking about how he was like the
number one or number two donor to Democrats during the
midterm elections, right? Seth MacFarlane.
Speaker 2 (01:34:10):
Yeah, yeah, but.
Speaker 3 (01:34:11):
None of his donations worked. And also, he gave, it
was like thirty two million he gave to the Dems.
He gave like twenty four million to the Republicans. And
the reason he was giving this money: number one,
there were some, like, pro pandemic response candidates he wanted to back,
most of whom didn't you know, do well. But also
like a lot of the money was towards Republican and
(01:34:31):
Democratic candidates who were going to be part of the
regulation of the crypto industry because he wanted to have
a seat at the table and push regulations in a
way that.
Speaker 2 (01:34:38):
Yeah, okay, so, yeah, candidates that would enable...
Speaker 3 (01:34:41):
That is, yeah, exactly. So anyway, in general, nothing at
Alameda or FTX was as it seemed. And in that dick-riding
Sequoia article, Caroline Ellison, the CEO of Alameda,
he talks about her a bit and, like, frames
her as, like, this quintessential innocent nerd girl, all plucky
and ethical and optimistic, to show, like, these are the
(01:35:02):
kind of, you know, smart young Gen Z kids that
are, you know, building this great company. And, like,
she showed up in LARP gear to meet with Sam
and talk about the future of their great financial enterprise,
and she's an ethical altruist. Since everything fell apart, it's
come out that she and Sam were dating each other
and possibly other members of the company. And also people
(01:35:23):
have found.
Speaker 2 (01:35:24):
Her doing the full Liz Holmes look. Here we go.
Speaker 3 (01:35:27):
People have found her Tumblr, and boy is she sketchy
as hell. I'm going to quote from a report on
her Tumblr activity in Decrypt. Boy howdy. When
I first started my first foray into poly, I thought
of it as a radical break from my trad past,
the account wrote in February twenty twenty. But tbh, I've
come to decide that the only acceptable style of poly
(01:35:50):
is best characterized as something like imperial Chinese harem. The
account went on to detail how a polyamorous dynamic should ideally
function as a cutthroat market of sexual competition and subjugation.
None of this non hierarchical bullshit, the account elaborated. Everyone
should have a ranking of their partners, people should know
where they fall in the ranking, and there should be
vicious power struggles for the ranks. God, it gets worse, Jamie.
(01:36:12):
The Ellison linked account also demonstrated a substantial preoccupation with
HBD, or human biodiversity, an online euphemism for the discredited
fields of race science and eugenics, popularized... Wait, all right,
oh, oh boy. Give me one more paragraph and then
we can talk about this, Jamie. Ellison has for years
vocalized her die-hard obsession with Harry Potter. In one post,
(01:36:35):
her affiliated Tumblr account tied her love of online character
quizzes to her penchant for sorting Indians by
their caste, which she presumed to indicate genetic distinction. Oh
my god, holy shit.
Speaker 2 (01:36:48):
Even JK Rowling wasn't thinking something that fucked up. And
that's really saying something.
Speaker 3 (01:36:54):
Is amazing astonishing.
Speaker 2 (01:36:57):
Tumblr is exclusively for sure fan fiction and things that
trigger my eating disorder. That's the hit I cannot fuk like, Oh,
that's so dark.
Speaker 3 (01:37:07):
It's... I almost can't fathom it, right? And again, there's
so much. We probably will do a follow up at
some point, because, like, the fact that
it was this easy for, like, some fucking crypto rag,
they're the only ones who reported on it, to find
her Tumblr where she talks about race science makes me
think these guys were all probably into a lot more
fucked up shit than they let on. Yeah, so
(01:37:28):
we'll see. God, we'll see. I just...
Speaker 2 (01:37:34):
We don't have the hour for me to decompress the
way I need to after hearing that sentence specifically.
Speaker 3 (01:37:40):
Yeah, I need a cigarette.
Speaker 2 (01:37:43):
I'm gonna start smoking today. And she's fucking... happy? Oh
my god, that is... that is so funny. Brutal place
to land, so fucking... geez.
Speaker 3 (01:37:55):
Hopefully all of their money's gone, but they
probably squirreled away millions and stuff for themselves. Although, at
least one of the articles I've read says that like
his net worth is effectively zero now, but I don't
think anyone actually knows what his net worth is right
now and how much he got, Like his company went
from valued at thirty two billion dollars to most recently
(01:38:17):
there's something like six hundred and fifty thousand dollars in
actual assets left. But I also kind of think he
and the others probably have millions or tens of millions
that they set aside in shady ways for themselves.
Speaker 2 (01:38:31):
Wow. Oh yeah, I mean well that does sound like
what the Beanie Babies guy would do. And that is
my yardstick for morality. That is so that that is very,
very scary to consider. I feel like guys like that.
The thing that is, I mean, I don't know, whenever
you hear about like, it is so karmically satisfying to
(01:38:54):
note that someone like SBF can can be completely bottomed
out and like destroyed by something like this. But the
thing is like, when when rich guys like that lose everything,
they just come up with worse, more hateful ideas and
then come back. And that is like always what kind
of scares.
Speaker 1 (01:39:13):
Me about that?
Speaker 3 (01:39:14):
Yeah, it is so funny. I don't know, Jamie, I...
Speaker 2 (01:39:19):
Don't know what the solution is. That's not like I
want them to have money again. I just you know,
how can you make someone say less again? I think
the podcaster's dilemma.
Speaker 3 (01:39:30):
Look, here's where I'll land on this. You know,
I don't think people should be thrown
into cages, generally, unless there's literally no other way to
stop them from harming folks, and I don't think
that's the case with these people. So instead, I think
the actual solution is to close from the outside all
(01:39:52):
of the doors to the fucking rich person apartment
complex they occupy in the Bahamas, lock it from
the outside, and once a week drop in food and
necessities via a helicopter, and never let them leave or
use the internet again. Yeah, let them all just be with
their friends in their weird little compound, going
(01:40:14):
increasingly insane with their Chinese harem shit.
Speaker 2 (01:40:17):
I don't know.
Speaker 3 (01:40:17):
I guess that's another kind of prison. But if we
film it, we can make money.
Speaker 2 (01:40:22):
But then SBF may have to face his worst fear,
which is reading a book.
Speaker 3 (01:40:27):
Yeah, I guess, like, my serious answer is, what do
you do with people like this? You stop them
from ever being able to have access to money again,
or start companies again, and hopefully eventually they find something
to do that actually helps human beings and is, like,
of any kind of use. Like working at a
(01:40:47):
grocery store, that's a real benefit. People need to get
food, and people need... like, that's a respectable, honest way
to make a living. And if any of these people
were to get a job working at a Safeway,
they would be providing an infinitely greater benefit to the
human race than they could ever have performed.
Speaker 2 (01:41:06):
Perhaps a bit more of that ethical side of the
effective altruism you're looking for. Yes. I'll be honest. I did
not come to the recording today with a solution for
the prison industrial complex.
Speaker 3 (01:41:15):
But I don't have it.
Speaker 2 (01:41:16):
I still don't like it. I still don't love it.
Speaker 3 (01:41:21):
Yeah, yeah, I have no solution. But you know what
I do have, Jamie? What? Your pluggables.
Speaker 2 (01:41:30):
No, you don't. I have those?
Speaker 3 (01:41:32):
Well, you have them, but I'm letting you have them.
Speaker 2 (01:41:35):
Oh my god, I mean I'm giving.
Speaker 1 (01:41:37):
I mean I could probably do them. If nobody wants
to take this job, I'll.
Speaker 2 (01:41:41):
Do I can, I'll do them. Oh, Sophy, do you
want to do them? Yeah?
Speaker 1 (01:41:45):
I mean, you can pre order Jamie's book, and
that is linked in her Instagram bio. True. You can
follow her on Instagram at Jamie Christ Superstar, and you
can follow her on Twitter at Jamie Loftus Help, if
Twitter is still around. She has a podcast that she
co hosts with Caitlin Durante called The Bechdel Cast. You
(01:42:07):
should listen to her many limited run series, including her
most recent run, which is Ghost
Speaker 2 (01:42:12):
Church. And Sophie produces, along with The Bechdel Cast and
everything, every podcast on the planet.
Speaker 1 (01:42:18):
This has now become a plug for me. Do I
get everything, Jamie?
Speaker 2 (01:42:23):
Yeah, that's exactly what I would have said, except worse.
Speaker 1 (01:42:27):
Yeah, So pre order Raw Dog.
Speaker 2 (01:42:31):
Yeah, thanks, Sophie. Yeah, pre order Raw Dog. If you
listen to the hot Dog episode of Bastards and didn't
like it, you did like it, now buy the book.
Speaker 3 (01:42:41):
Yeah, legally, you did like it. And if you disagree
with that statement, we will send the CIA to kill
your family. Yeah.
Speaker 2 (01:42:49):
That, and let's make sure to attribute that quote to
Robert Evans specifically. There we go, And.
Speaker 1 (01:42:55):
Speaking of Robert Evans specifically, Robert Evans specifically, Margaret
Killjoy specifically, and myself, we'll be doing a Behind the
Bastards virtual live stream show on December eighth. You can
get tickets at moment dot co slash BtB.
Speaker 3 (01:43:12):
Yeah. Also, I have a Substack now, because Twitter's not
doing great, and that makes me so sad. Why? I
like writing things. I got to write a thing last
night. I want to write more things.
Speaker 2 (01:43:26):
I subscribed. Better for me than being on
Speaker 3 (01:43:28):
Twitter all the goddamn time. Yeah, you can find it
at shatterzone dot substack dot com. Go there and I
will be writing. I'll try to write something every week.
Maybe I won't, maybe all of this will will fall apart,
be lost in time like tears in the rain. Or
maybe you'll get a new thing for me every week.
There's no way to know.
Speaker 2 (01:43:47):
Every time... half the time when I get something from
someone's Substack, because I have subscribed to quite a few,
but every time I get a message from someone's Substack,
it's always like, I'm so sorry. I'm like, I didn't notice.
Just give me the content and I'll...
Speaker 3 (01:43:58):
Send you there about Yeah, yeah, it's fine. Look we all.
I don't know. I've been meaning to write more stuff
as opposed to just tweeting shit posts. So maybe it'll happen.
Maybe it won't. There's no way to know. Perfect. Welcome
(01:44:19):
to the movies you liked as an adolescent and are
now ashamed of shamecast. I'm Robert Evans, and today in
the seat of eternal self hatred Jamie Loftus. Jamie, you
have just admitted, prior to the show starting, that you
once loved the Dana Carvey vehicle Master of Disguise. What
(01:44:40):
do you have to say for yourself?
Speaker 2 (01:44:41):
I feel fucking sick with myself, Robert. I haven't been
able to sleep in the twenty years since its release.
In my defense, it came out on my birthday, which
I feel like had a lot to do with
why I considered it my favorite movie. I felt a
kinship with it.
Speaker 3 (01:44:57):
Yeah, birthdays are like a performance enhancing drug for
movies that you see when you're eleven. Absolutely true, the
blood doping of positive movie memories.
Speaker 2 (01:45:08):
And furthermore, it's the most famous children's movie that was
shooting on nine eleven, and so I think, in
that way, it felt like it would have been
disloyal to my country to say a word against The
Master of Disguise, particularly the Turtle Turtle scene. However, you know,
I think the movie certainly doesn't hold up, and I
(01:45:30):
feel fucking sick with myself every single day.
Speaker 3 (01:45:34):
Yeah, I wonder, because famously Dana Carvey was dressed as
the Turtle Man when those planes hit those towers, and
James Cameron was twenty thousand feet below sea level exploring
the bottom of the ocean, and I wonder if they
ever crossed paths at, like, a Hollywood event and started
talking about nine eleven. So what were you up
to on that day?
Speaker 2 (01:45:56):
And then Mark Wahlberg just leans in, apropos of nothing,
and is like, if I was there...
Speaker 3 (01:46:02):
I would have stopped it. Ah, Jamie, you know,
I can honestly say nine eleven was the
first time I felt like this country let me down,
because it delayed the release of the seminal Tim Allen
film Big Trouble, which, uh, it sure did
feature a classic Patrick Warburton performance. By the way, it
(01:46:23):
sure did. Goddamn right. You see his ass, everybody. If
you want to see Patrick Warburton's ass, it's in that movie.
Speaker 2 (01:46:30):
He is so underrated. I absolutely love love that man,
A total kid. The talent.
Speaker 1 (01:46:37):
Friends, just to say, this is Behind the... this is Behind...
Speaker 3 (01:46:41):
The Bastards, a podcast about none of the things that
we were talking about. Uh, and today, actually, Jamie, I've
got you back in the hot seat, back in the office,
which is more of an ephemeral feeling than a physical space, making.
Speaker 2 (01:46:57):
me answer for my sins right off the jump.
Speaker 3 (01:47:00):
Yeah, by talking again about our friend Sam Bankman-Fried,
who you and I chatted about right after his life
collapsed last year, and I feel like we should do an update.
Speaker 2 (01:47:11):
I think we should too, because I'll be honest, I
have done truly everything I can to avoid knowing more
about him, so I would say I know basically nothing
about him since we last spoke.
Speaker 3 (01:47:22):
Yeah, it's amazing because normally, you know, I'm an empathetic being,
Like Sam has a face that I've just always wanted
to hit from the first time I saw a picture
of him. And normally when somebody goes through this much shit,
when like their life is this ruined, right, like I might,
I feel a little less like hitting them because the
world has hit them, but I still kind of want
to sock him in the fucking jaw. Every time I
(01:47:46):
see this guy.
Speaker 2 (01:47:49):
Let me just check it. Has his face changed.
Speaker 3 (01:47:53):
He wears suits sometimes now when he goes to court,
he's not wearing the basketball shorts.
Speaker 2 (01:47:58):
Helpful and well that's god. Yeah, that's like two different
versions of an embarrassing, desperate way to present. Okay, he's
wearing a he's wearing a suit.
Speaker 3 (01:48:08):
Now. Yeah, he's wearing a suit now. It looks like shit,
but whatever. Of course, I try not to judge people
on how they look unless that's part of their con.
And Sam is a guy for whom dressing like
a slob was always part of his, like, tech
bro genius, you know, persona that he was putting on
(01:48:28):
like it'd.
Speaker 2 (01:48:29):
Be fooled every puce a mug shot.
Speaker 3 (01:48:35):
Oh yeah, yeah, there's... God, I believe there's a mugshot
of him out at this point, certainly from when he was in
the Bahamas. So when we last left our buddy
Sam in November of twenty twenty two, he had been
arrested in the Bahamas and extradited to the United States,
where he was charged with so many financial crimes that
he might theoretically spend more than one hundred and fifteen
years in prison.
Speaker 2 (01:48:54):
Wow.
Speaker 3 (01:48:55):
Now, in the days and months since, a lot has happened,
and a lot more has come out about how the
former crypto mogul behaved before and after his fall. I
want to start with some of the latter information, because
by far the most entertaining story to drop as a
result of these serried legal filings against Sam is that
he was using the nonprofit arm of FTX to attempt
(01:49:16):
to buy a sovereign nation he could use as an
apocalypse shelter. That is by far the funniest story that's
dropped about these guys in the days months since.
Speaker 2 (01:49:26):
What was the plan there?
Speaker 3 (01:49:28):
Oh, that's a great question, Jamie. So let's talk about
the island of Nauru. It's an island in the southwest Pacific.
I think it's about twenty one hundred miles away from
the coast of Australia, which, given the fact that Australia
is really out in the middle of nowhere, is pretty
close to Australia. It is presently the world's smallest island nation.
(01:49:49):
It's got a population of about twelve thousand or so,
not a ton of people, and as an incredibly tiny country,
one of its primary assets is simply the fact that
it is a sovereign nation. There's things that countries can
do that nothing else can do, like issue certain kinds
of passports and visas, and do certain kinds of things
with banking. Right, So, if you were a really tiny
(01:50:09):
country that doesn't have like a shitload of natural resources,
one thing you can export is the benefits of your sovereignty.
To say, really rich people who who might want certain
things that you can.
Speaker 2 (01:50:21):
Do as a country plan is coming to get So.
Speaker 3 (01:50:26):
There's a number of ways in which Nauru has kind
of taken advantage of this to get by. One of
them is that they've sort of sold access to their
land to Australia to use, so that Australia has used
them for years as an offshore processing center for asylum seekers.
I think this stopped most recently in twenty nineteen, but
there's been a couple of waves of this, and it
(01:50:46):
was not a pleasant place, right? Conditions were so brutal
in sort of the offshore processing center on Nauru from
twenty twelve to twenty nineteen that several residents carried out
like deadly forms of protest, sewing their own lips shut or
lighting themselves on fire as a protest of the conditions
they were facing. Pretty ugly scene. In the late nineteen
(01:51:09):
nineties, kind of prior to this period, Nauru was the
chief money laundering location for the emerging Russian oligarch class.
They helped a lot of these oligarch types you've heard
about in the context of Putin launder about seventy billion
dollars in ill gotten funds during the early stages
of the Russian Federation.
Speaker 2 (01:51:25):
Ah, a good Bastards cameo.
Speaker 3 (01:51:27):
Oh yeah, no, Nauru, Nauru's adjacent to a whole
lot of shitty people. Great. Yeah, it's a lovely place.
Nauru was also designated a money laundering state by the
US Treasury in two thousand and two, which led to sanctions,
which I think is probably why they moved to, like,
letting Australia offshore migrants there for a while. And since
Australia stopped doing that in twenty nineteen, Sam Bankman-Fried
(01:51:51):
and his fellow effective altruists felt like they might have
had an opportunity there, right? Like, Nauru's kind of looking
for some new cash flow. They're looking for a sovereign
nation to do some things.
Speaker 2 (01:52:00):
For... well, it's an opportunity to be effective.
Speaker 3 (01:52:05):
Yeah, yeah, to be effectively altruistic towards yourself, specifically. Sam
and his brother Gabriel Bankman-Fried. Gabe is actually the guy
sort of, like, organizing this attempted endeavor, using FTX's charitable
donations arm, and their goal was to purchase the entire
island in order to construct what Gabe called a bunker
(01:52:26):
shelter that would be used to, quote, ensure that most
effective altruists survive in the event that between fifty percent
and ninety nine point ninety nine percent of the world
population perishes in a catastrophe.
Speaker 2 (01:52:38):
Jesus. And that's, like, a pretty common... I feel
like the Gabe of the situation is a very common
character in Bastards, like, just the devious brother. I mean,
I hate the bastard most of all, but I really
detest the devious brother as well. It just
reeks of insecurity. Get your own grift.
Speaker 3 (01:52:57):
Man, especially since they're framing it... Not... look, you know,
when you got, like, a guy like Peter Thiel, right,
and everybody knows Peter Thiel's got, like, an evil rich
guy bunker to wait out the end of the world
if it happens. And, like, fuck Peter Thiel. But at
least Peter Thiel's not pretending, I have a bunker
to, like, save the world by putting aside just the
best people. He's like, no, I'm a giant piece of
shit and I'm gonna save myself if things go wrong. Okay,
(01:53:19):
like, fuck you, Peter Thiel, but at least it's honest. They're
Speaker 2 (01:53:22):
Framing it as like a blood bunker for just boys.
Speaker 3 (01:53:25):
Yeah, it's me and my blood boys. Gabe is like, no,
we have to, in the event there's an apocalypse, we
have to save all the EAs because they're the best
people and that's what's best for the world. That'll do.
We're utilitarians, right? The greatest good for the greatest number
of people is to save all of the best people,
which is me and my friends, the other finance kids
(01:53:45):
who call themselves effective altruists so they don't have to
feel bad about the fact that all they do is
play the stock market like every other piece of shit. Anyway.
Speaker 2 (01:53:53):
So lucky that they all met each other.
Speaker 3 (01:53:55):
So lucky, all the good people. I would love, I
would love, honestly... Like, look, when the strike
is over, somebody at a network, bring me on. I
will write you a banger fucking script about an apocalypse
where just the EA guys are left in their bunker
trying to figure out the...
Speaker 2 (01:54:12):
Okay, another incentive to end the strike. That would
fucking rip.
Speaker 3 (01:54:18):
Yeah, yeah, we could do. We could do quite a tale.
We could have some fun with this one, Jamie.
Speaker 2 (01:54:24):
I think that there would be some effective
altruists, like, ridiculous enough to do cameos.
Speaker 3 (01:54:30):
Oh yeah, we could get, we could get fucking William
MacAskill in there, no problem. Bring his Scottish ass on board. Yeah,
vain little perverts. Oh, and they're all, I'm
gonna be honest, kind of stupid, so I bet we
could trick him. Like, you don't have journalistic ethics with
an HBO show. Yeah, we could just say we're bringing
them on for an interview. We could, like, film around them,
(01:54:53):
like in... what was that fucking movie with Steve Martin where
they have to film the fake movie around... Uh,
what is it, Chris...
Speaker 2 (01:55:03):
Rock? Shit. Oh, it's the... it's... oh, see, now you're...
Speaker 3 (01:55:10):
Now now now it's I know. It's driving me nuts.
We have to figure this out.
Speaker 2 (01:55:13):
Uh, I rewatched it recently.
Speaker 3 (01:55:15):
It holds up.
Speaker 2 (01:55:17):
It does hold up. It's like one of the best movies.
Speaker 3 (01:55:20):
Bowfinger. Fucking dope. We could do a Bowfinger with
William MacAskill, where he thinks he's getting interviewed for a
documentary and we're really making him the bad guy of
our HBO series. Bring in Steve Martin too. Fuck it, he's
still... he still got it.
Speaker 2 (01:55:37):
Yeah, he's got good politics.
Speaker 3 (01:55:38):
He'd be great.
Speaker 2 (01:55:39):
Yeah.
Speaker 3 (01:55:41):
Uh glad, glad we remembered it. Watch Bowfinger, guys. It
holds up truly.
Speaker 2 (01:55:45):
If you take anything away from today... It really does.
I was shocked at how well it held up.
Speaker 3 (01:55:49):
Yeah, startlingly good movie. Pretty good, like, Scientology joke. So,
Gabriel Bankman-Fried ran FTX's charitable donations wing, which included
a ton of money for what many people have characterized
as political bribes. I'm not saying that he was bribing
politicians for FTX. I'm saying that's what a lot of
people have characterized what he was doing as.
Speaker 1 (01:56:11):
Now.
Speaker 3 (01:56:11):
That much has been known for a while, but it
was not until the current management of what remains of
FTX sued Sam. Because again there's new management that's trying
to recover as much money as possible, and they're throwing
Sam under the bus because why wouldn't they. And that's
how all this stuff got revealed because they have all
of ftx's internal communications, so they found a bunch of
shit like that Sam was having Gabriel try to buy
(01:56:33):
the island of Nauru. Now it is unclear how serious
their attempt to buy this island was a representative of
the island's government has been like, no, no, no, we
were never putting our island for sale. This was never
a thing that was going to happen. And maybe that
is the case, maybe they were just being idiots fucking around.
But in a memo between Gabriel and an FTX officer,
(01:56:54):
the discussion was centered around the idea of buying the island,
of being in control of it as a sovereign country,
not just purchasing land.
Speaker 2 (01:57:02):
So I mean, that's a lot of work for a bit.
Not that I put it past him, but I'm just like,
it's not sounding like, yeah.
Speaker 3 (01:57:09):
No, I don't put it past them. It's possible that whatever Nauru government official is telling the truth, that they were never considering selling the island. But it's also possible that Gabe et al. believed they could buy the island, right, right.
Speaker 2 (01:57:24):
They may in fact be that dumb yeah yeah, yeah.
Speaker 3 (01:57:28):
Now, there were discussions between these FTX guys about using Nauru as a base for human genetic experimentation. You get the feeling that their goal was to create modified, posthuman, godlike bodies for their fellow effective altruists so that they can live forever and dominate mankind after the collapse as undying immortals.
Speaker 2 (01:57:50):
That just sent the ugliest photoshopped image, like it was involuntary, how quickly the, like, no-neck, huge-chest photoshops appeared. Super.
Speaker 3 (01:58:03):
Absolutely, absolutely. Hey, yeah, it sounds like a mix between, uh, that one video game with the big robot dinosaurs and those sci-fi books by that guy who wound up being real anti-Muslim.
Speaker 2 (01:58:19):
But yeah, yeah, it could be a lot of guys.
I wonder which guy.
Speaker 3 (01:58:23):
Oh, it's, I think, uh, Ilium and Olympos. I forget the name of the author, but the premise of the books is that, like, in the future, a bunch of rich guys turn themselves into gods and decide to, like, recreate the Trojan War with themselves as the Greek gods, and they, like, resurrect a bunch of dead archaeologists to make sure that they get the details right. It's fun,
(01:58:44):
it's quite a series, except for the weird moments of bigotry. Oh, that good stuff. Dan Simmons, I think, is the author. Anyway. So yeah, they're talking about, we want to create a human genetic experimentation base, and they're like, we want to figure out what the sensible regulations around human genetic enhancement are, but we also want
(01:59:05):
to build a lab. And while they're talking about this, Gabriel adds, cryptically, probably there are other things it's useful to do with a sovereign country. I don't know what he means by that, but yeah, probably, huh. So that could be as banal as just, like, money laundering, which Nauru obviously has quite a history of, or issuing things like passports. But given the fact that all these dummies
(01:59:27):
are permanently poisoned by a mixture of sci fi fandoms
and weird futurist cults, I think it's safe to say
we all dodged a bullet by the fact that they
never got too far in this scheme. Now, my favorite
thing about this whole idea is how dumb it is
on its face. There are some countries that could act
as really good apocalypse shelters for the super rich, right
Switzerland is one. A lot of rich people have their
(01:59:49):
apocalypse shelters in Switzerland. New Zealand is another. But the
problem for that is that, like Switzerland and New Zealand
are both functional states. Obviously, if you're a billionaire there,
you can have some outsized influence, but you're not just
going to run everything because there's other interests and like
a functional system of government in place in all of
those places. I think Sam and them were hoping that
(02:00:09):
since Nauru's small enough, they could just utterly dominate the government,
but they ignored the fact that it's like one of
the worst places imaginable to have as an apocalypse shelter.
For one thing, the island does not grow much food,
which means it has to important ninety percent of what
it needs to sustain its very small population.
Speaker 2 (02:00:28):
It's all good. We have so much money, it's okay.
Speaker 3 (02:00:31):
We'll just keep importing it. Yeah. It also has very
little fresh water, and most of its infrastructure is on
the coast and vulnerable to both rising sea levels and hurricanes.
It is very close to the bottom of places that
you would want to have as a shelter. So very funny.
All these people are silly. Now, Jamie, the main reason
(02:00:53):
current FTX is revealing all this is that they are suing the old management of the company, i.e. Sam, to try and reclaim a billion or so dollars they argue was funneled illegally into nonsense like this and into the pockets of Bankman-Fried and his lieutenants in the months before FTX collapsed due to insolvency. The lawsuit against
Sam also includes some more confounding lines described in this
(02:01:16):
paragraph from an article on the suit by crypto news
site decrypt. The lawsuit further says that the projects run
by the FTX Foundation were frequently misguided and sometimes dystopian.
These included a three hundred thousand dollar grant to an individual to write a book about how to figure out what humans' utility functions are, as well as
a four hundred thousand dollars grant to an entity that
(02:01:38):
posted YouTube videos related to rationalist and effective altruism material,
including videos on grabby aliens. Now, does that all seem
like nonsense to you, Jamie?
Speaker 2 (02:01:50):
I don't know, Robert. Let's hear them out about the grabby aliens.
Speaker 3 (02:01:54):
Don't worry, I'm going to explain all of this horseshit
to you.
Speaker 2 (02:01:58):
Uh my god, Yeah, kind of rick and morty ass nonsense.
Speaker 3 (02:02:03):
It is some rick and morty ass nonsense. It's much
dumber than anything in Rick and Morty because at least
some of the people there understand story structure.
Speaker 2 (02:02:12):
Unlike them. And please understand that it's a joke, that it's a bit.
Speaker 3 (02:02:16):
Yes. Yeah, so if you aren't terminally adhered to one
of the stupidest subcultures in the broader tech sphere, that
probably does seem like nonsense, and it is. But let's
start with the bit about paying someone three hundred grand
to write about what a human's utility function is. Now, yes,
what is a utility function? Great question? In economics, a
(02:02:36):
utility function is the measure of welfare or satisfaction of
a consumer as a function of the consumption of real
goods like food. In simple terms, it's a way of
describing the satisfaction or other benefits gained by consuming a
specific resource. This is important to rational choice theory, which
is a theory that states that individuals use rational calculations
(02:02:56):
to make rational choices to achieve outcomes aligned with their
own objectives. Now, most people who aren't economists think that
talking about the economy this way is silly, because people
are not in fact rational actors, and in fact, we
make shitty decisions all the time, guided by misinformation or
pressure that causes us to inaccurately interpret the potential value
of something like a college education versus the cost of say,
(02:03:19):
student loans.
Speaker 2 (02:03:20):
Right, one could argue that Sam Bankman-Fried and co. are a wonderful example of that irrationality.
Speaker 3 (02:03:29):
But that's economics. This is not an economics podcast. I'm
not an economist. And the way that Bankman-Fried and his fellow EAs talk about utility functions is not the same as how economists talk about it, right? So when
economists are talking about this, it's part of how to
kind of figure out why people might make rational choices
by understanding like the value of sort of the ranked
(02:03:51):
like preference they give to certain things that they might
expend resources on. Broadly speaking, when Bankman-Fried and EAs talk about utility functions, what they mean is something even more abstract. And I'm going to quote from a summary from a write-up on effectivealtruism.org, a website you should avoid at all costs. Quote.
Speaker 2 (02:04:09):
I'm really not looking forward to finding out how this
somehow relates to the grabby aliens.
Speaker 5 (02:04:15):
I'm sorry, in a very dumb way, Jamie. Quote. Okay, good. EAs and rationalists love dropping the term in every conversation.
Speaker 3 (02:04:23):
Using the term utility function can be immensely helpful when
aiming to maximize positive impact or do the most good.
The concept of a utility function provides a systematic way
to quantify and compare the potential benefits of different actions,
thus helping to guide decision making toward the most effective outcomes.
By representing values, goals, or beneficial outcomes numerically, utility functions
(02:04:43):
allow for a structured comparison and prioritization of actions. If,
for example, your goal is to alleviate global suffering, you
could assign values to different charitable actions based on their
estimated impact, thus creating a utility function. This function can then guide you to allocate your resources, like time or money, where they will generate the greatest utility or good. Now,
that just seems like you're saying you should try to
(02:05:06):
figure out how your money's gonna be spent best right
before you spend it. But that's not actually what they're saying.
What they are doing here is setting up a utility function. A utility function in this context is a way of assigning a number that you have made up. There is no objective value to this number. There's no rigor to this. You are making up a number to determine
(02:05:26):
the value of spending your money in certain ways. And
you are doing this so that whatever it is you
want to do with your money, you can justify numerically
as the scientifically best way to spend your money. So
see, you can argue in this way by assigning these values in whatever wonky way you want. Now,
me paying taxes to fund roads and a healthcare system
(02:05:48):
is a shitty use of my money because it doesn't
optimize this thing that I consider to be of higher
long term value. And the thing that's a higher long
term value to me is spending money on fucking space
travel research so that I can be a demi god
on Mars. Right, that's best for human beings in the
long term. So in utilitarian speak, you know, the greatest
good for the greatest number of people is that's getting
(02:06:09):
to Mars as opposed to feeding starving people. Right now,
you know the utility function of getting to Mars is
much higher, So that's where our money ought to go.
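If you want to see how hollow that arithmetic is, here's a minimal sketch, mine, not anything from the episode or from FTX, in Python with completely invented numbers, of the procedure Robert is describing: assign made-up utilities to actions, then declare whichever one scores highest "the rational choice." The labels and values are hypothetical.

```python
# A sketch of the "utility function" math described above: the numbers are
# arbitrary inputs, so the "scientifically best" answer is whatever the
# person picking the numbers wanted in the first place.

def best_action(utilities: dict[str, float]) -> str:
    """Return the action with the highest assigned utility."""
    return max(utilities, key=utilities.get)

# Weight the far future heavily and Mars research "wins"...
longtermist = {"fund famine relief": 1_000.0, "fund Mars research": 1e15}
# ...weight present suffering heavily and famine relief "wins."
presentist = {"fund famine relief": 1e9, "fund Mars research": 10.0}

print(best_action(longtermist))   # fund Mars research
print(best_action(presentist))    # fund famine relief
```

Same function, same "rigor," opposite conclusions, which is the whole point being made here: the output is determined entirely by the made-up inputs.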
Speaker 2 (02:06:16):
Well, yeah, it all just, like, comes down to, like, the greatest good is me, ha, smiling a little bit.
Speaker 3 (02:06:23):
Yeah, exactly. There's literally some writing on that in that vein.
So it is through math like this that EAS are
able to look at a world where millions face
death by famine or disease or rising sea levels and
say the best way to help the planet is for
us to become finance Bros. And then spend our money
investing in AI companies or whatever. The fundamental selfishness of
(02:06:44):
this whole community is made clear when you read the
essays these people write on their websites, like LessWrong, a blog founded by self-declared AI expert Eliezer Yudkowsky. Yudkowsky is a rationalist, which is a related subculture to the EAs. There's a lot of bleed-over; Sam and a lot of his people were rationalists or rationalist-adjacent
(02:07:05):
and Yeah. To give you an idea of how these
people talk about utility functions, I'm going to read an
excerpt from an article on this website titled your utility
function is your utility Function? By David Udell. I've been
thinking a lot lately about exactly how altruistic I am.
The truth is that I'm not sure. I care a lot about not dying, and about my girlfriend and family
(02:07:26):
and friends not dying, and about all of humanity not dying,
and about all life on this planet not dying too.
And I care about the glorious transhuman future and
all that and the ten to the fiftieth power or
whatever possible good future lives hanging in the balance. And
I care about some of these things disproportionately to their
apparent moral magnitude. But what I care about is what
I care about. Rationality is the art of getting more
(02:07:47):
of what you want, whatever that is, of systematized winning
by your own lights. You will totally fail in that
art if you bulldoze your values in a desperate effort
to fit in or to be a good person, or
to act in the way your model of society seems
to ask you to. So you see what he's saying,
if you read between the lines there, what the most
rational thing for me to do is whatever makes me feel.
Speaker 2 (02:08:07):
Best, whatever gives me a little smile.
Speaker 3 (02:08:10):
Don't let people shame you for spending your resources, you know,
entirely on yourself and your own whims like you're actually
a hero.
Speaker 2 (02:08:18):
If you do that. It is so fascinating to
me to watch, like, I don't know, it seems like
this like self conscious reflex where they feel the need
to define rational by something that is more closely aligned
with reality and then immediately be like, but what that
(02:08:41):
actually means is... yeah. That's at odds.
Speaker 3 (02:08:45):
The core of it is always being able to say that, like, well,
if you suggest that, number one, I have a responsibility
to other people, and that that responsibility is to some
extent out of my hands, which is what we all
say when we're in a society. Right, I don't have kids.
I don't have a choice not to spend some of
the significant amount of money I pay in taxes educating
other people's kids. Now, I'm not a complete piece of shit,
(02:09:07):
so I'm fine with that because, like I've I've done
very well for myself, and kids need educations. That's just
a nice way for the world to work. But these
people are more of the feeling that, like, no, I
should do whatever is possible to avoid paying for, you know,
a public school system. And in fact, I'm often going
to advocate for some sort of like weird voucher based
(02:09:29):
system that allows me to not fund public schools because
the greatest good for the greatest number of peoples, for
me to ensure that like me and my rich kid
friends all get to send our kids to special schools
where like what you know, fuck anybody else, Like I
don't have any other responsibility for the broader population. All
that actually matters is like me maximizing you know, my
(02:09:51):
own personal happiness. But I still want to feel like
like I'm a hero for doing it. Right, if I
avoid paying taxes in order to, like, spend all of my money investing in OpenAI so that I can, like, take people's jobs away, like I want to
feel like a hero for doing that, because what I'm
arguing is that it's best for, you know, the people five hundred years from now, and there's going to
(02:10:11):
be more of them there than there are today. So
I'm a hero for like getting rich off of this
company today.
Speaker 2 (02:10:17):
You know, yes, well, because there's first of all, there's
definitely going to be people in five hundred.
Speaker 3 (02:10:22):
Years, sure, for sure, definitely, Garrett, if these eas get
their way, sure yeah.
Speaker 2 (02:10:28):
Yeah, yeah. If we are doing the most good there,
that's so fucking exhausting. I mean, it does, like your
Peter Thiel example, not that, you know, doing evil is
good in any capacity, But the most exhausting kind of
evil is the one that also insists on you validating
that it's not actually.
Speaker 3 (02:10:47):
Evil all that time, that it deserves to.
Speaker 2 (02:10:49):
Shut the fuck up and ruin my life or don't.
Speaker 3 (02:10:53):
It's the same thing with a lot of these fucking
right wing media grifters, where like it's not enough for
them to be rich, it's not enough for them to
like, get their way politically. They feel like they're owed, like, being cool and respected. And it's the same thing here. Like, these people are all finance ghouls. They are all, like, actively fighting to
(02:11:14):
avoid paying taxes and to be able to concentrate ever
more power in an ever smaller number of people, to
destroy the lives of you know, artists and people who
are like working folks in order to like make more
short term profits. This is all what they actually care
about personally, but they want to feel like Gandhi while
they do it right, because what they're doing is guaranteeing.
(02:11:38):
What they'll argue Jamie is that like, well, you know
this may hurt this company, you know, me getting involved
in this company. Sure, we may destroy a lot of
jobs in the short term, but by doing so, we'll
be able to make sure that the AI we build
that eventually becomes our god is one that cares about
the future of humanity, and that's better for the most
people in the long run.
Speaker 2 (02:11:56):
Yes, and history should remember me as the greatest man
to ever live. And people will you know, that'll be
on television someday when my computer is writing every television show.
Speaker 3 (02:12:06):
Fucking hate these people. So yeah, it's cool stuff. And I think the fundamental selfishness of these people... because all that effective altruism and rationalism are really about is creating a made-up system of numbers to justify you pursuing your own benefit as, like, science, right,
(02:12:26):
as like scientifically rational. This is really.
Speaker 2 (02:12:29):
It's not as if that doesn't, you know, like, math problems serving a very small group's self-interest, it's not as if that doesn't exist outside of this circle.
But it's just like bizarre, how uniquely like they lack
any sort of self awareness or I don't know, it's
(02:12:51):
just they're so fucking annoying, is what I'm trying to say.
Speaker 3 (02:12:54):
That's very true, Jamie. And yes, this is all really
clear when you look at how the EA movement treated Sam Bankman-Fried before and after his fall from grace. If effective altruism can be said to have a pope, and it can, because all of these Silicon Valley philosophical movements are just Kirkland-brand Catholicism, that pope is Will MacAskill, an Oxford moral philosopher and co-founder
(02:13:17):
of the Center for Effective Altruism. When FTX collapsed and Sam got arrested, he was quick to put out a statement of outrage: I don't know which emotion is stronger, my utter rage at Sam and others for causing such harm to so many people, or my sadness and self-hatred for falling for this deception. Now, the only reason I would hesitate to call this horseshit, Jamie, is that horseshit,
(02:13:37):
by virtue of being inanimate waste, possesses a fundamental honesty that MacAskill is incapable of.
Speaker 2 (02:13:44):
Yeah, fuck yeah, Robert, that's the bitchiest thing I've heard
you say in a while.
Speaker 3 (02:13:48):
That's accurate, thank you. He and SBF met back at MIT when Sam was an undergrad. MacAskill convinced him that he could maximize his impact on humanitarian causes by earning to give, you know, making as much money as possible so that he can give it away in a way that presumably will help the world. Now, when Sam ultimately launched Alameda Research, it was an EA project from the start,
(02:14:11):
staffed by Sam's friends in the community. One software engineer
told Time almost everyone who came on in those early
days was an EA. They were there for EA reasons,
says Naia Bouscal, a former software engineer at Alameda. That
was the pitch we gave people, this is an EA thing.
Speaker 2 (02:14:27):
I know, I'm once again thinking about sports video games, and I know it's not sports video games, it is.
Speaker 3 (02:14:33):
It is basically a sports video game.
Speaker 2 (02:14:35):
I just had the flashback to the Yeah interview you did.
I was thinking, Okay, it's true, it's true.
Speaker 3 (02:14:45):
In the early days, Sam's pitch was that fifty percent
of the company profits would be donated to EA causes,
and the initial round of investing that got the company
off the ground was funded entirely by rich EA types. Publicly, MacAskill talked about Sam like he was the EA messiah, probably because FTX's Future Fund provided a huge amount of support for his movement. In just nine months
(02:15:07):
in twenty twenty two, the Future Fund, run by Nick Beckstead, a moral philosopher who used FTX money to support various causes, gave more than one hundred and sixty million dollars in other people's money, funneled through FTX, to effective altruism, including thirty-three million dollars to organizations that MacAskill had a direct interest in. So that's why MacAskill spoke so positively
(02:15:29):
about Sam, which is made even more fucked up when you realize that other people in the EA movement had started warning MacAskill about Sam Bankman-Fried being a con man as early as twenty eighteen. And I'm going to quote again from Time; this is from the very start of Alameda Research. Within months,
the good karma of the venture dissipated in a series
of internal clashes, many details of which have not been
(02:15:52):
previously reported. Some of the issues were personal. Bankman-Fried could be dictatorial, according to one former colleague. Three former Alameda employees told Time he had inappropriate romantic relationships with his subordinates. Early Alameda executives also believed he had reneged on an equity arrangement that would have left Bankman-Fried with forty percent control of the firm, according to
a document reviewed by Time. Instead, according to two people
(02:16:14):
with knowledge of the situation, he had registered himself as
sole owner of Alameda. So basically, Sam has unethically taken
control of the firm and all of the money invested
in it, and he is like fucking his subordinates.
Speaker 2 (02:16:27):
It was, again... he's also a sex pest. The least surprising thing I've ever heard in my entire life.
Speaker 3 (02:16:33):
Taking money from what are effectively people's bank accounts in FTX, and he's using it to prop up the value of different crypto tokens on Alameda to make shit look legitimate, which is money laundering, right. It's, like, theft and fraud, you know?
Speaker 2 (02:16:48):
Or is it the future Robert remember that?
Speaker 3 (02:16:50):
I mean, yeah, remember what Larry David taught us all?
Speaker 2 (02:16:55):
I know there were some really unforgivable Uh.
Speaker 3 (02:16:59):
It's pretty funny, it is.
Speaker 1 (02:17:01):
Of all.
Speaker 3 (02:17:01):
It is like, I am on team I-can't-be-angry-at-Larry-David, because honestly, if Larry David advertises a financial product and you decide, he seems like a good guy, this seems like a credible company to me... I don't know. That's a little bit on you, right? That's a little bit on you.
Speaker 2 (02:17:21):
Who else was in that commercial? That was, that was a damning commercial.
Speaker 3 (02:17:26):
Yeah it is.
Speaker 2 (02:17:27):
It is.
Speaker 3 (02:17:27):
I still stan Larry David. But that's for a variety of reasons.
Speaker 2 (02:17:33):
So I'm curious what your other reasons are. But that's
for another day.
Speaker 3 (02:17:36):
Yeah, that's for a different day. So, again, all of this shit that was obvious in Alameda in twenty eighteen, this is all the stuff that he gets arrested for in twenty twenty two. And when they become aware of this, of the fact that he's fucking his subordinates and money laundering, a bunch of EA people start raising alarm bells. In
(02:17:59):
twenty eighteen, several Alameda executives try to
force him out of the company and accuse him of
gross negligence. Sam wins the power struggle, though, and so
most of the EA management team and half of the
company resign. Now this might be you could see this
as like, well, maybe these effective altruist types were just
in it for the altruism and once they saw Sam
with Shady, they packed up. And that's true for those
(02:18:21):
individual people. You know, there's some decent people who just
got caught up in the movement and they clearly had
some degree of moral integrity. But the broader effective altruism movement,
Speaker 2 (02:18:31):
Incredibly low bar there, including, unbelievably, Will MacAskill.
Speaker 3 (02:18:35):
its pope, never disaffiliated from Sam. MacAskill was talking about Sam like the second fucking coming up until late twenty twenty two, and in exchange for laundering Sam's reputation, Sam sent tens of millions of dollars, one hundred and sixty million, to EA causes in twenty twenty two alone. And that's why MacAskill maintained movement ties with Bankman-Fried. Quote.
(02:18:58):
In the weeks leading up to that April twenty eighteen confrontation with Bankman-Fried and in the months that followed, McCauley and others, she was one of the executives that left, warned MacAskill, Beckstead and Karnofsky about her co-founder's alleged duplicity and unscrupulous business ethics, according to four people with knowledge of those discussions. Bouscal recalled speaking to McCauley
(02:19:20):
immediately after one of McCauley's conversations with MacAskill in late twenty eighteen. Will basically took Sam's side, said Bouscal, who recalls waiting with McCauley in the Stockholm airport while she was on the phone. Will basically threatened her, Bouscal recalls. I remember my impression being that Will was taking a pretty hostile stance here, and he was just believing Sam's side of the story, which made no sense to me.
So Will again perfectly willing to like throw down against,
(02:19:42):
you know, the more honest people in his movement and
personally threaten them in order to keep the money flowing
to his fancy cause so that he can heal.
Speaker 2 (02:19:50):
Right, and then the second that it becomes, you know, PR-inconvenient to keep the association... there should be some, maybe there is, like, a term for that. Just, like, even the cadence of what you read earlier, of just, like, the disingenuousness of, like, oh, I had no idea. You're like, okay, no.
Speaker 3 (02:20:08):
You knew damn well what was going on, and you were willing to continue pretending he was a good guy and aligned with your movement as long as the money kept flowing, right?
Speaker 2 (02:20:20):
Yeah, it no longer serves your best interests to
stand by him.
Speaker 3 (02:20:25):
Yeah, so, anyway. Sam, we don't actually know if he's completely bankrupt. We know he took about a billion in payments and loans from FTX. He claims not to really have any of that money, and that he's working on using what assets he does have to, like, try to make a few more of the investors whole. That said,
(02:20:47):
we know that a big chunk of the money that
he made was funneled into real estate in his parents' names.
So that's fun. Speaking of his parents. One of the
big early mysteries of his case was that when he gets out on a two hundred fifty million dollar bond,
his parents are signed on to the bail agreement with
their house as collateral. But there were also two mystery
(02:21:09):
co-signers, and their house, we'll be talking about this in a second, is on the Stanford campus. The two mystery co-signers were Larry Kramer, a family friend and the former dean of Stanford Law School, and Andreas Paepcke, who signed a two hundred thousand dollar bond. Paepcke is a senior research scientist at Stanford and an advisor to several Valley startups.
(02:21:29):
So Stanford is very invested primarily because of who Sam's
parents are in this case, which is interesting.
Speaker 2 (02:21:36):
To me. And okay, because that was, I mean, that is my question: outside of his parents, I guess, realistically, what is in it for them by taking that risk?
Speaker 3 (02:21:48):
I think it's actually just that these people are close
with his parents, who are professors at Stanford and deeply
tied into that community.
Speaker 2 (02:21:57):
I can, I mean, I can very much see, like, bougie parents with influence being like, Sam's just a boy. Anyone could have made this mistake. He thought he was doing the right thing. He got mixed in with the wrong crowd.
Speaker 3 (02:22:09):
This has to just be like some sort of crazy
mistake because they can't imagine he just it's so dumb
and blatant, like all he did was rob people in
order to like gamble, right, Like, fundamentally, there's not a
difference between like him taking people's money, claiming that he's
got a sure stock tip and then gambling at Vegas, Like, legally,
there's no difference between that and what he did, but
(02:22:32):
they can't because his parents are like ethicists basically at Stanford,
and I don't think any of the people they are
social with can imagine that his crimes were that venal
and foolish. But you know whose crimes are not venal
and foolish, Jamie, Oh, tell me, Robert, the products and
services that support our podcasts.
Speaker 2 (02:22:54):
Never encountered a product I didn't love.
Speaker 3 (02:22:57):
No, that's exactly right. That's exactly right, and we are back.
So initially, the reason why these other Stanford people were secret signers on this bail agreement is because they
(02:23:21):
were afraid that they would be attacked because of how
angry people are at Sam and there was a reason
for this. Shortly after he was sent to house arrest
at his parents home, someone drove their car into a
barricade set up outside of their house on the Stanford campus. Now,
as I stated earlier, both the elder Bankman Freeds are
former Stanford professors and their home is on campus, which
(02:23:42):
has created issues for the school. The university still will
not officially acknowledge that one of the world's most famous accused felons currently resides in their prestigious, walled academic garden.
Speaker 2 (02:23:54):
Stanford, like, really fumbles shit constantly.
Speaker 3 (02:23:58):
Oh yeah, yeah, it's absurd. It's because they're not really
that smart.
Speaker 2 (02:24:03):
But, well, yes, this again is another great case for it. It's also, like, I don't know, just the arguing, like, well, his parents are, like, ethicists, how could he be fucked up? It's like, have you ever met someone who was raised by ethicists? No offense to ethics people, all love,
(02:24:25):
but, like, they know, they know, they know the language.
Speaker 3 (02:24:28):
Yeah, yeah. So one Washington Post article I found noted Stanford Law School didn't respond to requests for comment when asked whether they could confirm a rumor that a nearby student co-op had attacked the Bankman-Fried home with eggs. Stanford campus police did not respond. Socially, however, Bankman-Fried
is a source of deep fascination. There are party flyers
with his likeness. He's a punchline in campus comedy sketches.
(02:24:51):
Students ride their bikes by on dates. The campus community
is well aware he's there. An annotated map locating the Bankman-Fried home was posted on a student-only social network.
Speaker 2 (02:25:02):
Okay, I'll be honest. If someone asked you out and they're like, here's my concept for a first date: we're gonna go watch Samuel Bankman-Fried's house.
Speaker 3 (02:25:11):
We're getting married that night, Jamie. Right? I was—
Speaker 2 (02:25:15):
Like, we're going, like, bring a flask, watch one of
history's stupidest villains, and then fall for each other in a
parking lot.
Speaker 3 (02:25:25):
Absolutely, Jamie, that's the dream. That's the dream. True.
Speaker 2 (02:25:29):
None of this.
Speaker 3 (02:25:30):
That's effective altruism right there. That's the greatest good for
the greatest number of people.
Speaker 2 (02:25:34):
Sam could stand to learn a thing or two from.
Speaker 3 (02:25:37):
These theoretical Stanford students. Yes. So again,
Sam's parents are both well-respected teachers and experts in
different fields of ethics. Both were recruited to
the university in the nineteen eighties, and they almost immediately
hooked up. Barbara Fried made a name for herself as
a— she's like a philosopher, basically. Her big thing was
(02:26:00):
she wrote a paper dissecting the ethics of the trolley problem.
Whereas Joe Bankman is a finance ethics guy. He writes
a lot about like again, like not breaking the law
with finance shit. It seems to be a big focus.
Speaker 2 (02:26:13):
Of his. This is a very obvious thing to say,
but I do in defense of the Bankman family. I
do appreciate when someone really rolls with their last name.
I think that that is a very fun quality. This
is this is an example of it not working out.
I think it's also equally bizarre when someone has a
(02:26:34):
last name where you were like, why are you not
doing that job? For example? Yeah, maybe I've told you
this before. It's one of my favorite facts I've learned
in my life. My childhood dentist's name was doctor Vagenis.
And I—
Speaker 3 (02:26:50):
Great.
Speaker 2 (02:26:51):
Nevertheless, Vagenis insisted on going into teeth for some reason.
I found it infuriating. I'm like, just roll with it, man,
that's your last name.
Speaker 3 (02:27:02):
An ethics board needs to go after this guy. No,
I'm sorry. I know you don't know how to do
this job, but that's your life now. People will trust
you until you figure it out. Speaking of nominative
determinism, Joe Bankman would be it.
I wouldn't trust Joe Bankman as a drug dealer. But
I would, I would. I would trust Johnny Cocaine as
(02:27:25):
a financial expert, like like as a stockbroker. Johnny Cocaine, Yeah,
let him invest my money. He knows what he's doing.
Speaker 2 (02:27:34):
I'm just saying. Say what you will about Joe Bankman,
and I'm sure you should. I know nothing about this man,
but at least he got into the right business.
Speaker 3 (02:27:43):
Yeah, he does, he does. Now, people talk with,
like, awe about this guy. Like, he's so ethical, he's
such like a decent man. He thinks so much about
doing the right thing. When The Washington Post is,
like, giving examples of noteworthy things in his past, one of—
Speaker 2 (02:27:58):
The ones that he must talk about— Ted Lasso.
Speaker 3 (02:28:01):
In two thousand and two, he wrote a tongue in
cheek suggestion on how to avoid a Major League Baseball strike,
and his solution in this paper that's supposed to be
a joke was to levy taxes on teams and players
who struck. That could only be avoided if the players
donated money to charity or the teams agreed to sell
nickel hot dogs. Giants fans! Now, I don't know what
(02:28:22):
the joke is here, but it apparently tore it up
among upper-middle-class Ivy League finance academics. They
all talk about this as very funny.
Speaker 2 (02:28:31):
Wait, it comes up again? Like, remember when he said—
Speaker 3 (02:28:35):
It was in the Washington Post article about this guy,
as, like, look at this. This is
like a noteworthy moment from his career, this
bad joke that he made. But I guess that's what fucking
Stanford people find funny.
Speaker 2 (02:28:47):
Imagine. Maybe it's hilarious that—
Speaker 3 (02:28:50):
Yeah, I don't get it.
Speaker 2 (02:28:52):
Christ. Maybe Stanford people just think it's hilarious—
they just like the word hot dog. They're like, oh, poor
people food, love it. Yeah. I'm like, I don't know.
Speaker 3 (02:29:01):
Now, everything you find about these people is, like, their
friends talking about how it's so
shocking that this could happen. You know, these were like
the best people we knew. They were so concerned about ethics,
they raised their sons like little adults, and they were
always talking about utilitarianism. How could this have gone wrong?
I don't know. I feel when I read anecdotes about them,
(02:29:22):
like it's pretty obvious why it went wrong. And to
kind of make that point, Jamie, here's a quote from
an excellent write up by Puck news Quote. Bankman, who
once boasted to a friend that his father had dutifully
recorded every cash receipt, wrote three case books on tax
shelters and tax evasion, becoming one of the country's leading
experts on the subject. One of Bankman's law students in
those early years was Peter Thiel, who later told Bankman
(02:29:45):
that his tax law class was his most valuable because
he was able to put a lot of his Facebook
stock in an IRA. As Bankman would later recall in
a podcast, this modest feat of financial engineering would save
Thiel more than a billion dollars. So ethics, Jamie. No,
I'm saying a billion dollars in taxes so that Peter
Thiel can spend it giving Joe Swastika money to write
(02:30:07):
New York Times columns on racism. Hooray, I love ethics.
Speaker 2 (02:30:12):
Holy shit. Yeah. And here I was, a clown,
a fool, about to be like, well, everyone wants to
rebel against their parents. That's probably why he's a cartoon—
Speaker 3 (02:30:21):
Villain. When he talks ethics, it's about not getting busted for
being a piece of shit with money. That's what it
seems like to me. Not an expert on ethics. I
am an expert on being a piece of shit, though, so—
Speaker 2 (02:30:33):
Yeah, but you're a far different piece of shit.
Speaker 3 (02:30:36):
Thank you, Jamie.
Speaker 2 (02:30:37):
Can I celebrate that about you?
Speaker 3 (02:30:40):
That is?
Speaker 2 (02:30:41):
But I mean, I'm not, I guess shocked that that
is exceedingly ethical to the Stanford crowd, but wow, damning.
Speaker 3 (02:30:51):
So anyway, Barbara Fried, being a good liberal, was horrified
by the Trump election and chose to fight back by
founding a political fundraising group, Mind the Gap, which
was extremely successful during the Trump years and is rumored
to have acted as the model for FTX's own political
donation machine. Both of Sam's parents have seen their reputations
suffer with his arrest, and I'm going to continue with
(02:31:12):
a quote from Puck. Official property records show that Joe
Bankman and Barbara Fried were the named owners of a
sixteen point four million dollar beachside vacation home in Old
Fort Bay, part of a broader real estate portfolio owned
by FTX and senior executives, totaling hundreds of millions of dollars.
They may have stayed there while working with the company
sometime over the last year, Sam said, though he denied
(02:31:32):
knowing any details about the three hundred million dollars worth
of real estate that FTX and his parents bought in
the Bahamas.
Speaker 2 (02:31:39):
Oh, okay, so they knew about absolutely everything, then.
Speaker 3 (02:31:42):
Yeah, it sounds like it, sounds like. Now, Joe and
Barbara have said that they've been working to return the
property to the company for some time. Working to. Joe Bankman,
in particular, has hardly been a passive observer in his
son's scandal and may now be exposed to some legal
risk himself. Bankman interviewed and hired the first lawyers for
Alameda Research back in twenty seventeen, and effectively served as
ftx's first attorney. He handled the inbound interest that came in and
(02:32:04):
made the resulting introduction that helped FTX raise one hundred
and thirty million from his former law student. Private equity
mogul Orlando Bravo spent his free time on ftx's charitable
and regulatory efforts, and was ultimately in the room before
Sam made the fateful decision to sign the documents that
declared Chapter eleven. So they seem very involved and shady themselves.
(02:32:24):
I don't buy, oh, they're so innocent, their son just,
you know, made a mistake or whatever. They all just
didn't think that this was criminal because the people whose
money they were taking were poor, and they're fucking
Stanford brats. Like, I have no respect for them. I
hope they lose their fancy Stanford house. They hurt a
lot of people.
Speaker 2 (02:32:42):
It's like, yeah, particularly because, I mean, it sounds like
this was their M.O. from the start anyway. So why
would we now think that they would be above this
behavior? Like, I don't know, it seems like their definition
of ethics is things you can technically get away with.
Speaker 3 (02:33:00):
Yeah, it's not illegal for Peter Thiel to get this
billion dollars that he doesn't pay taxes on, because, you know,
fuckery. Anyway, mister Bankman and missus Fried have now joined
the expanding cast of disgraced Stanford affiliates. This includes, recently,
university president Marc Tessier-Lavigne, currently accused of manipulating images
on research papers in a way that is equivalent to
(02:33:21):
falsifying lab data for Alzheimer's research. Obviously, there's also Bastards
Pod alumni the Theranos lady, another famous Stanford disgrace. And
then there's the fact that their alumni—
Speaker 2 (02:33:33):
My favorite Stanford disgrace.
Speaker 3 (02:33:35):
Yeah, and then there's all their alumni who have created
companies or helped run them that have shattered the foundations
of our democracy in pursuit of a quick buck. Stanford's
current reputation is so grimy that a Washington Post article
on SBF's associations with the school ends with these lines,
and this is very funny, Jamie. Adrian Daub, a Stanford
professor of comparative literature and German studies and the author
(02:33:56):
of What Tech Calls Thinking, sees an encouraging sign in
Stanford being only peripherally involved in the Bankman-Fried scandal.
That might not have been the case ten years ago,
he notes, when the Silicon Valley hype machine operated at
more of a fever pitch than it does today. Other
than his physical location, it's actually not that connected to
us for once, Daub said. And in that way, it's a
sign of progress and also a little bit melancholy. Stanford
(02:34:18):
was a place where the future was shaped, so it's
quite possible that's not happening anymore, that it's happening in
the Bahamas now and only comes to Palo Alto once
it gets indicted.
Speaker 2 (02:34:30):
That's so funny. I'm hung up on other than its
physical location.
Speaker 3 (02:34:35):
Yeah, other than the fact that he's here. He's not
very involved.
Speaker 2 (02:34:38):
That's a big one, Babe, that's a big one.
Speaker 3 (02:34:40):
But it does seem significant to me.
Speaker 2 (02:34:43):
No, if they're happy, I'm happy. That's great.
Speaker 3 (02:34:46):
It's funny.
Speaker 2 (02:34:47):
We have to return to Elizabeth Holmes at some point
because I'm very uniquely interested in her rebranding as Liz.
Speaker 3 (02:34:56):
Yeah, genius, Liz, because now I forgot about her horrible crimes.
Speaker 2 (02:35:01):
Yeah, I forgot about all of her crimes once she
had a cool and relatable name and had cool and
relatable babies.
Speaker 3 (02:35:07):
Well, she'd escape Henry Kissinger going by Liz.
Speaker 2 (02:35:11):
That's true. We can't take away that she scammed Henry Kissinger. Yeah,
she scammed people out of their lives, but, you know— yeah,
but the Henry Kissinger thing we cannot forget.
Speaker 3 (02:35:22):
I say we give her six months off as a
result of the Kissinger thing. Okay. So this brings
us to the subject of what precisely Sam Bankman-Fried
has been up to in the nine months or so
since his fall from grace. The short answer is that
he has not had a wonderful time. In January, a
month or so after he was granted bail under house arrest,
(02:35:44):
the Southern District of New York accused him of inappropriately
contacting former and current FTX employees in order to influence their
testimony on his case. Sam tried to frame all of
this as part of his ill advised apology tour that
he embarked on last year in the lag period between
FTX collapsing and his formal charges.
Speaker 2 (02:36:01):
Yeah, did he hit them up? What happened?
Speaker 3 (02:36:04):
Oh, he's like calling them. Actually, I'm going to read
a quote from Puck in order to describe how he's
illegally contacting people. On December twelfth, the same day he
was arrested in the Bahamas, Bankman-Fried emailed FTX bankruptcy
CEO John J. Ray the Third, offering potentially pertinent information
concerning future opportunities and financing for FTX and its creditors
and asked to work constructively with Ray in the chapter
(02:36:26):
eleven team to do what's best for customers. No response. Then,
after his extradition, the crypto mogul sent another email to
Ray on December thirtieth, in which he offered advice on accessing
Alameda funds. Still no response. Then, while being summoned to
court in New York, SBF tried Ray again on January second.
Mister Ray, I know things haven't gotten off on the
right foot, but I really do want to be helpful.
(02:36:46):
As I'm guessing you've heard, I'm in NYC for the
next day. I'd love to meet up while I'm here,
even if just to say hi. Ray did not take
him up on this offer. And he has, no shit,
reached out to several people, and it's always like,
I just want to help, you know, get as much
money as we can for the customers. I just want to,
you know, help you deal with the confusing aspects of this.
But it's like, you're not supposed to be talking
(02:37:08):
to people who were involved in the company like this
when you're in Sam's situation, because they're probably going to
testify against you.
Speaker 2 (02:37:16):
I don't feel convinced that he understands that. I don't know.
He's talking like a fucking spam email.
Speaker 3 (02:37:22):
Yeah, he really is. And in general, Sam has opted
to take all of the actions under house arrests that
are likeliest to cause stress ulcers in his lawyers. In
addition to repeatedly contacting FTX employees, he decided to start
a sub stack where he planned to explain how ft
and It's like, there's that famous line from the wire,
are you taking notes on a criminal conspiracy? But in
(02:37:44):
this point the case, it's like, are you publishing blog
posts about your criminal conspiracy after being indicted?
Speaker 2 (02:37:52):
That's how you grow an audience, Robert. Yeah, it's all
a part of the play.
Speaker 3 (02:37:57):
Yeah, it's like if Al Capone had started a New
York Times column on having people machine-gunned in
alleys after he'd gone away to fucking Alcatraz.
Speaker 2 (02:38:07):
Hear me out, you guys, it actually makes way more
sense when I explain it.
Speaker 3 (02:38:12):
Yeah. So he only writes, I think, two
posts before he gives up, because they're bad. He's
bad at blogging. I tried to read them. He's a dogshit
writer.
Speaker 2 (02:38:21):
Look, well, can I— I mean, not to be aggressive,
but that's most Substack people.
Speaker 3 (02:38:26):
By the way— not you, Robert.
Speaker 2 (02:38:31):
There's more. No, I know you have more than two,
but I'm just saying, the average friend of mine that
harangues me into, you know, these damn emails— it's
two, and then they're done. Anyways, but I'm probably going
to start one. That's— I'm just being insecure about what
(02:38:51):
I wanted to do.
Speaker 3 (02:38:51):
I think you'll beat Sam Bankman-Fried. No,
no question, Robert.
Speaker 2 (02:38:56):
Your Substack is great. Thank you, sir. Your Substack's great.
Speaker 3 (02:39:00):
Thank you, Jamie. This has been wonderful for my ego.
I actually read— so Sam's is bad, though, and it's bad,
like— and part of what he's doing is,
like, he's been charged with a bunch of specific crimes, right,
and the posts that he puts up he does not
(02:39:20):
acknowledge any of the charges against him. He doesn't like
defend himself from them. Instead, he lays out a bunch
of misleading and arcane spreadsheets to try and, like, argue
that the company shouldn't have collapsed the way it did
and, like, why he didn't
realize it was so bad. He's just doing
the same thing as, like, the EA shit we opened
(02:39:41):
the episode with, right? This throwing out, like, confusing piles
of numbers in order to distract people, right, Like, this
is just like chaff, you know. That's what he's doing,
is he's throwing out chaff in the form of a
bunch of poorly formatted spreadsheets. They don't convince anyone that
he's innocent.
Speaker 2 (02:39:57):
But it sounds like it also did not work.
Speaker 3 (02:40:00):
It did not work at all. You should not do
this if you are being indicted for numerous financial crimes
bt dubs. So in early February, the judge overseeing Sam's
case forbade him from using encrypted messaging apps like Signal
because he was so frequently trying to talk to other
people who were part of this case with him in secret,
which is illegal. He also got in trouble because he
(02:40:23):
was caught using a VPN, which could have potentially allowed
him to hide his communications. Sam argued he was just
using a VPN to access his international NFL Game Pass account.
Speaker 2 (02:40:33):
So I was like, did he just say he was
trying to watch Canadian Netflix? Yeah, that would be
fucking classic. Officer—
Speaker 3 (02:40:41):
I just had a lot of shit to torrent, homie, like,
you get it, my man. He has since been limited
to a normal flip phone due to his repeated inability
to abide by his bail conditions. Okay, Now, some might
note that Sam has already gotten more second chances than
most accused criminals get with their bail conditions. It seems
(02:41:01):
accurate to say that the leniency he has received gave
him reason to feel as if he could act with impunity,
which is why a couple of weeks ago he leaked
his ex-girlfriend's diary to The New York Times, which—
Speaker 2 (02:41:12):
Take me through the— wait, that wasn't what
I thought you were going to say.
Speaker 5 (02:41:18):
I know, I know, nobody thought it was gonna head here,
you know. I mean, although it seemed like we
were due for another spiteful action against a woman for
seemingly no reason.
Speaker 3 (02:41:29):
Oh, absolutely absolutely. Now I will say I don't like
this woman either. Caroline Ellison is, I think, a pretty
shitty person.
Speaker 2 (02:41:40):
We spoke about her last time, right, Yeah, yeah.
Speaker 3 (02:41:42):
She's an unpleasant lady. She was the former CEO of Alameda.
I've been fucking listening to this podcast called Spellcaster, which
is, like, a Wondery podcast about Sam Bankman-Fried. I
don't like it. The woman who does it was
at a bachelorette party with Caroline Ellison
right before the charges dropped and was like, oh, she's
so smart, she's so— and she repeats the same bullshit
(02:42:03):
everyone says about Sam: they were, like, geniuses.
And it's like, no, they just threw out a
bunch of numbers you didn't understand and convinced you they
were smart because they said numbers, right? Like, there's nothing
these people have done that is smart.
Speaker 2 (02:42:17):
With situations like that, I'm like, I guess I appreciate
the disclosure, but like, why the fuck were you hired
to do this show?
Speaker 3 (02:42:25):
Well, it's it's because big media is just as tiny
and insular a world full of rich people as finance,
and in fact, a lot of the same families have
people at the Times and people in fucking investment banks,
which is why here at cool Zone Media, we exclusively
hire people who used to sell ketamine on their college
(02:42:46):
campuses in order to get by.
Speaker 2 (02:42:47):
You know.
Speaker 3 (02:42:48):
That's the Cool Zone guarantee. Or Adderall.
Speaker 2 (02:42:51):
You know— yeah. And I appreciate that you made an
exception for some of us. You didn't have to
be good at it, you just had to—
Speaker 3 (02:43:01):
No, in fact, we will not hire you if you
were good at selling drugs on campus.
Speaker 2 (02:43:06):
Why did you even apply.
Speaker 3 (02:43:07):
To this mediocre part time campus drug dealers? That's our
that's our hiring pool. Yeah, that's our Like, I don't
know whatever, I don't know the names of enough fancy.
Speaker 1 (02:43:18):
New York's really not that big of a stretch, let's
be honest. I'm fucking proud of that.
Speaker 3 (02:43:24):
So Caroline Ellison, former CEO of Alameda and also Sam's
on-again, off-again beau, she immediately turns—
Speaker 1 (02:43:37):
I don't like you using the term beau, but continue.
Why not? That's what— should I use boo?
Speaker 2 (02:43:44):
I kind of like boo. I was like,
that doesn't really make sense.
Speaker 3 (02:43:49):
But there they were, co-boos. So she immediately turned state's
witness and admitted guilt for her share of the illegal
activities committed by Alameda, and she apparently, as
part of immediately rolling, handed over her diary. I
think that's how they got her diaries, as part of
the terms of, like— yeah. So it gets introduced
(02:44:09):
into evidence, which obviously Sam I think will get access
to as a result of that, because that's the way
discovery works. I believe that's how he got her diary.
Speaker 2 (02:44:17):
Was she also a Stanford Head?
Speaker 3 (02:44:20):
I don't think she went to— I think her
parents were professors at MIT.
Speaker 2 (02:44:23):
Huh, wow. Yeah, losers over there.
Speaker 3 (02:44:27):
Yeah yeah yeah, fucking just hobo University right there.
Speaker 2 (02:44:32):
Until it happens to me. I love when someone's diary
is introduced into evidence, and that brings me back to
Elizabeth Holmes yet again, when, like, her creepy
little sexts with Sunny Balwani came out. Ooh, some—
Speaker 3 (02:44:46):
Of the worst sets, some of the very worst sex.
Speaker 2 (02:44:50):
That is maybe the moment that I felt closest to her.
That's when she That's when Liz almost got me, because
she was sending walls of text to this guy and
then he was sending back, Okay. And it's like,
you know what? Brutal.
Speaker 3 (02:45:03):
No, no, she deserved a man like Jeff
Bezos, who would call her the most unsettling nickname I've
ever heard, you know. But at least he responded. Alive
girl. That's right, that's right.
Speaker 1 (02:45:20):
I think we, as a collective, blocked that out intentionally.
Speaker 3 (02:45:28):
Yeah, it's very funny. It does make it clear that
he's not a robot because like, nobody, nobody fakes that.
That's that's that's evidence that he feels something. What he
feels is off putting, it's frightening, it's like profoundly unsettling.
But he does feel something.
Speaker 2 (02:45:45):
But unfortunately, yeah, chat GPT could have outdone that in
terms of sounding like a person.
Speaker 3 (02:45:51):
Absolutely. Yeah. So anyway, Sam gets access to her diary
one way or the other, and then he hands her
diary to the New York Times so that they can
write an article about it. Now, that is unethical as
fuck and possibly illegal. The prosecution has asked that he
be jailed, that his bail be revoked, because of
(02:46:12):
what he did here. This is still going on
as we talk. I'll record a little update if he
does go to jail as a result of this. Hey everyone,
Robert here, just wanted to update you that, since we
recorded this episode a couple of days before you're hearing it,
Sam Bankman-Fried was remanded to custody. He is incarcerated
now and he will remain in jail after violating his
(02:46:34):
bail conditions, until his trial in October at least and
possibly well beyond that, depending on how the charges and
sentencing and all that stuff go. I should note that
kind of the most recent story after that is that
his lawyers requested that he be allowed to have his
ADHD medication and depression medication, which he ran out of
(02:46:57):
soon after being taken into custody. The judge has ordered
that he be given that medication. Obviously, I'm always in
favor of people who are incarcerated having access to medication,
if anyone's interested: I don't actually think putting Sam in
jail is going to do much good. I'm a little
bit more mixed on this than I normally am, just
because of the case of Adam Neumann, the WeWork
(02:47:20):
guy who got off scot-free from his giant financial
crimes and is now starting another giant grift company that
will probably fuck with a bunch of other people's lives.
But I do kind of think it's unlikely that we're
going to get much benefit out of this. That said,
I don't really feel for Sam. He had many many
chances not to be in this situation, and he fucked
(02:47:42):
all of them up. So, you know, fuck the guy.
Sam's lawyers have argued he was not attempting to discredit
a witness, but just to respond to a
toxic media environment, which he says unfairly portrays him as
a villain. And I guess we're part of that toxic
media environment. Although, Sam, pro tip here: handing your ex-girlfriend's
(02:48:02):
diary to the New York Times is a bad way
to seem like not the villain. That's kind of villain behavior, homie.
Hate to tell you.
Speaker 2 (02:48:11):
Again, but it's like again, you can imagine his like
doofy loser fucking logic of like, no officer, I was
just being a petty bitch. Is that just the law?
Speaker 3 (02:48:22):
And you're like, yes, yes it is, actually, sir. Bruh.
So humorously enough, that is the legal argument his lawyers
are making, and they they kind of have a point
because they're like, look, if you read the New York
Times article based on her diary, he seems like a
piece of shit. So clearly we weren't trying to influence
the prosecution. And like, they do have a point because
(02:48:44):
he does come across as the bad guy in that
article that he made happen. So that's funny.
Speaker 2 (02:48:50):
This is the bad guy in most things.
Speaker 3 (02:48:54):
Yeah, I'm gonna quote from some of that New York
Times coverage here. Mister Bankman-Fried and miss Ellison had
also started an unsteady romantic relationship, with
multiple breakups and reconciliations. At times, miss Ellison worried that
mister Bankman-Fried thought she wasn't good enough. When he
was around, she wrote in February twenty twenty two in
a Google document, she had an instinct to shrink and
(02:49:16):
become smaller and quieter and defer to others. After one split,
miss Ellison cut off communication with mister Bankman-Fried. I
felt pretty hurt, rejected, she wrote in an April twenty
twenty two Google document. Not giving you the contact you
wanted felt like the only way I could regain a
sense of power. Miss Ellison was compensated far less generously
than other top executives at FTX and Alameda, though it's
(02:49:36):
unclear whether she was aware of it. According to court filings,
the exchange's founders and other key employees received three point
two billion dollars in payouts and loans. Of that total,
six million went to miss Ellison, compared with five hundred
and eighty seven million for mister Singh ftx's head of engineering,
and two hundred and forty six million for mister Wang,
one of the founders. Mister Bankman-Fried received two point
(02:49:56):
two billion. So miss Ellison is definitely not innocent here. She
has admitted guilt in this case, but the reporting makes
it seem as if her main role was to act
as a patsy. Sam knew she was in love with
him and deeply insecure, so he put her in charge
of Alameda so that he could use it as part
of his grift to manipulate the value of his
(02:50:17):
crypto empire using customer funds. And basically, the six
million he gave her, which is a tiny fraction of
like the three billion they funnel to executives. That's like
him paying her to be a smoke screen. Right, She's
not an equal partner in this enterprise. And one of
the things that had happened right before this fell apart
is he had stopped paying attention to her
(02:50:37):
and Alameda in order to start throwing money through another
crypto exchange run by a woman he was fucking. So
it seems like this was—
Speaker 2 (02:50:45):
A path for him.
Speaker 3 (02:50:46):
Stop the— stop what? I'm not doing this, I'm just
talking about what he did.
Speaker 1 (02:50:51):
I just.
Speaker 3 (02:50:53):
It's well, it's bad. He's a bastard. That's why we're
talking about him.
Speaker 2 (02:50:56):
I just really— I don't know who
needs to hear this, but we just really need people
to stop fucking Sam Bankman-Fried.
Speaker 3 (02:51:03):
That's this is very bad, gross behavior.
Speaker 2 (02:51:09):
It's gross and it's bad for the world.
Speaker 3 (02:51:12):
Yeah yeah, so you know, fuck this guy. One bit
of schadenfreude I can give you all is that, according
to Puck News, Sam's present situation is so unpleasant that
he considers his trips across the country to go to
court in New York the highlight of his life now
because he gets to like go out in the street.
He's surrounded by lawyers in private security, so it's like
he's got an entourage again. He gets to travel. This
(02:51:34):
is like the closest he gets to feeling like when
he used to be a billionaire. So that's kind of fun.
The downside is from the perspective of an FBF hater,
the downside is that recently one of his charges was dismissed,
the campaign finance violation. This was not due to him
being innocent, but due to some legal weirdness involving the
letter of the extradition agreement the US signed with the Bahamas,
(02:51:57):
Basically, when we put together the agreement under which they'd
extradite him, that charge was not on the document. So the
feds had to, like, drop the charge in order to
not deal with a bunch of other bullshit. It's a technicality,
but it means that his brother, Gabe and several members
of the philanthropic team at FTX probably will not be
charged for very likely committing crimes. And I say they
(02:52:18):
likely committed crimes because FTX executive Nishad Singh already pled
guilty this spring to participating in a straw donor scheme,
so he is, yeah, and he pled guilty before they
dropped this charge, which he's got to feel like an
asshole for doing because now he is going to get
punished for that, even though SBF is no longer being
charged for it.
Speaker 2 (02:52:39):
Wow. That's— I mean, look, sometimes you do the ethical,
altruistic thing and it comes back to bite you in
the ass. What are you gonna do?
Speaker 3 (02:52:48):
Yeah, what are you going to do?
Speaker 1 (02:52:50):
Hey?
Speaker 3 (02:52:50):
Everyone just wanted to note that since we recorded this,
the prosecution has noted that they will be seeking to
add those charges back on that were dropped. So it's
possible that both Sam and other members of his inner
circle will be charged with all that stuff. We just
really kind of don't know at this point. But I
do want to note that the prosecution's at least saying, hey, like,
(02:53:11):
despite this little mess up, we are not just giving
up on this charge. So heads up about that could
change in the future. Well, Jamie, yeah, how you doing.
Speaker 2 (02:53:26):
I really well, I have a question for you.
Speaker 3 (02:53:29):
I have an answer for you.
Speaker 2 (02:53:31):
Well, I sure fucking hope so. No, my question is—
I'm curious, what do you see happening here? What
feels plausible to you at this time?
Speaker 3 (02:53:42):
You know, I've been seeing a lot of people be like, oh,
he's going to get off, he's got
too many connections, too many, you know, people
who he could roll on. I don't think he has
many people he could really roll on,
especially since these campaign finance charges have been dropped.
I don't really think he's got
the savvy of, like, a guy like John McAfee.
(02:54:04):
I believe John McAfee killed himself. I don't believe there's
anything shady there. I know a lot about the guy.
It makes sense to me that when his fucking running
finally stopped, he would do that. But McAfee probably did
have some dirt on some people. He was that kind
of cunning, right, I wouldn't be shocked if John McAfee
had put together some dirt on some people, right. I
(02:54:25):
don't think Sam Bankman Freed is that cunning. I don't
think he was like smart enough to have dirt on
anyone who could get him out of this situation. I
think there's a pretty good chance he does hard time.
I think he fucked with too many people, he fucked
with the money, and he fucked with it in too
dumb of a way. So I think he's screwed.
Speaker 2 (02:54:42):
Okay, that was my instinct as well, because I feel
like he doesn't even have... I mean, he doesn't have
any sort of, like... I think sometimes with these types
you get some sort of press narrative that it's like
they're playing four D chess, and like, even if that's
not entirely true, there's like a media narrative that sticks
to them that makes them seem more plausible but I
(02:55:03):
just feel like everything that's like, all of his actions
and all of the media surrounding him, except for a
very very small amount, seems to reinforce the fact that
he's completely incompetent and malicious in every way.
Speaker 3 (02:55:18):
Yeah, yeah, I would say that.
Speaker 2 (02:55:22):
Well, good God, I mean not that I you know,
I don't know. I mean it seems like he's fucked.
I certainly hope he's fucked.
Speaker 3 (02:55:31):
I hope he's fucked. I hate him. I think he's
a gross person. I hope Will MacAskill goes away or
gets, like, eaten. Eaten by a large fish would be
my pick if I got it. If, like, God is like,
what do you want me to do to Will MacAskill,
I'm like, you remember that thing he did with
a whale back in the day? What if he didn't
get out? What if a whale just eats him?
Speaker 4 (02:55:51):
You know?
Speaker 2 (02:55:52):
And then God'd be like, oh, amazing.
Speaker 3 (02:55:53):
I love playing the hits. Great pitch. You know what, Robert,
I'm gonna give you that HBO show you've been asking for.
Speaker 2 (02:56:01):
Robert, you are an amazing collaborator.
Speaker 3 (02:56:05):
Oh yeah, me and God co creators of my HBO series.
I am hoping if the strike goes on, they take
my reality show pitch, Super Soaker Full of Piss,
which I really think has some potential, Jamie. I mean, the premise is...
Speaker 2 (02:56:20):
No, go ahead, tell me the premise of Super Soaker
Full of Piss, Robert. I'm ready.
Speaker 3 (02:56:25):
So I'm in a van. I filled a super soaker
with my piss, and I drive around, like, Rodeo Drive,
and I get out when I see someone who looks famous,
and I soak them with a super soaker full of piss,
and then we film it and then I leave very quickly.
Speaker 2 (02:56:40):
Okay, I mean I'm on board with that.
Speaker 3 (02:56:42):
Yeah, I think it's a great... I think it's, you know...
I will get, you know... I don't...
Speaker 2 (02:56:48):
I would be kind of pro, like, if you could get...
I think the soundtrack is going to be really
key there. Like, I think if you could give me some...
Speaker 4 (02:56:55):
Like, jock jams situation, like, this is exciting. All live, all...
Speaker 3 (02:57:01):
All-live editions of Blink-182 songs, because, you know,
because they are one of the worst live
bands that ever played. So it's really just upsetting to
the viewer. That's the goal here.
Speaker 2 (02:57:14):
And you know, given who Blink-182, like,
rolls with these days, you may in fact run into
Travis on Rodeo Drive. Oh yeah.
Speaker 3 (02:57:23):
Just right in the face, just just a full load
of it, you know, just to say.
Speaker 1 (02:57:29):
I hate this idea because it means that some fucking
celebrity will murder you and then we'll be gone, and
then I'll be sad.
Speaker 3 (02:57:37):
I'm gonna be honest. I'm not great at recognizing celebrities,
so I'm anytime I could just see someone in a
suit and spray them with the piss. Yeah, Jamie, I know,
we can sell so many others. No, we just roll
down the street and it's Nick Jonas. Robert, spray, spray, spray,
get the fuck
Speaker 2 (02:57:55):
Out of here. Wow. No, it's Jonas, Robert.
Speaker 3 (02:58:00):
That's right, that's right, that's right. I'm going to rely
on Jamie to recognize him though, Oh my.
Speaker 2 (02:58:05):
God, I did see that, Jamie. Yeah, all right. Well,
the Jonas Brothers have their own vanity popcorn brand
now. Isn't that something? These are the amazing things I can
teach you, Robert, and.
Speaker 3 (02:58:19):
You've already taught me so much about hot dogs
through your best-selling book Raw Dog.
Speaker 2 (02:58:24):
Wow, perfect pivot.
Speaker 3 (02:58:26):
You're welcome us welcome.
Speaker 2 (02:58:28):
Spicy plug, though. Gorgeous, gorgeous plug. Hey, it's never too
late to start reading about hot dogs. It's never too late;
never stop learning.
Speaker 3 (02:58:37):
Reading about hot dogs and also America fascinating story. Raw
Dog find it wherever books are found.
Speaker 2 (02:58:46):
Yeah, well, thank you so much. I truly... I mean,
as you know, I did a Bastards episode
about hot dogs as I was writing that book so
that I would remain focused.
Speaker 3 (02:58:59):
Right, It's the best way.
Speaker 2 (02:59:01):
And I have heard, regarding the subject of that episode, George Shea,
that the hot dog eating community is actively protecting
him from its existence. He does not know it exists,
he doesn't know about the book. Everyone in his life is
really actively, like, every hot dog eater, or many
eaters I talked to, were like, yeah, no, we know
(02:59:24):
about the Bastards episode and we know about the book,
but we really don't want George to know about it.
Speaker 3 (02:59:28):
I was like, okay, fair enough. Beautiful. And Jamie, that
made me feel great. And you can sign up for
this show and all other Cool Zone shows ad free
at Cooler Zone Media. That's for Apple subscribers. We are
working on the Android option. You can find my novel
After the Revolution by typing after the Revolution into whatever
book buying site you use, or just walk into a
(02:59:51):
bookstore and demand it from the manager at sword point. Anyway, goodbye.
Goodbye. Bye.
Speaker 1 (03:00:01):
Behind the Bastards is a production of Cool Zone Media.
For more from Cool Zone Media, visit our website coolzonemedia
dot com or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts. Behind the
Bastards is now available on YouTube, new episodes every Wednesday
and Friday. Subscribe to our channel YouTube dot com slash
(03:00:22):
at Behind the Bastards