
June 16, 2025 66 mins

This episode kicks off with a take on marketing — not as a weakness, but as something Tim believes he could excel at, if only he had a product worth pushing. That launches a long-form, semi-improvised dive into the mind of Steve Jobs: not just the public myth, but the obsessive, detail-driven version that cared as much about keynotes as he did about hardware. There's admiration here, but also satire, with a focus on why Jobs' ideas actually landed — and how most people completely miss the point.

From there, the monologue expands into a chaotic but intentional meditation on logic, genius, and cultural mythmaking. Van Gogh becomes a case study in misunderstood brilliance. Jesus is examined as a PR story with missing context. Even Trump shows up, framed as someone who operates (however messily) according to internal logic. It’s loose, fast, and unapologetically nonlinear — but underneath the tangents, there’s a steady argument about what it means to speak truth in public. Watch on YouTube: https://youtu.be/s6NEM2q7Pio


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Hey, this is episode 113 of the 10 Vox Robime Show.

(00:05):
Eeeeeepie!
Now that we got that out of the way, let's get into the episode.
You can see I'm really excited to do this. I'm looking at myself in the monitor.
I do not look very excited to be here today. But when I'm looking into the camera, I can fake it.

(00:26):
I have the... I could fake things if needed.
I could even market myself. People say all the time in show business that they are
bad at marketing. I hate... I don't know how to market myself. That's a part I can't...
I don't know how to market myself. I'd be so famous if I just gave a shit about the

(00:50):
marketing. Not me. I know how to market myself. That's the fun part is being like, "Hey,
I got something... I got something cool to... that's the problem. I don't have anything worth
promoting. So I'm an honest person. But I'm still... if I... the second I had something

(01:11):
to market, I would be good at it. I wouldn't have to use manipulative tactics because it would
just be good marketing and a good product. Not just good marketing." Yeah, I'm not... I'm not...
I'm not good at that. I'm not just good at bullshit marketing. I'm not good at that. I'm

(01:35):
good at the... the honest kind of marketing that Steve Jobs did. He's like, "Okay, this...
everyone else's product is shit." See, this one is not. This one fits a ton of songs on and
it goes on and it can... it can connect and it... it syncs... it syncs from your computer,

(01:58):
which I also invented too. The Mac, it syncs to your Mac or your Windows piece of shit.
It syncs to your piece of shit Windows computer even... even just as fast. But it doesn't
have Thunder... what... but at any rate, that's not relevant for this hypothetical
bullshit I was saying. Okay, you get... I forgot what I was... sorry. I don't know if you get it

(02:26):
because I forgot what I was even in the middle of acting out. I was just in the character of
Steve Jobs, which I... I could do. Not... like a one-to-one replication. Like I'm not like
an actor who's gonna replicate... no, I'm not a hack. I happen to have the same deep-rooted

(02:51):
personality traits that came from actual, experiential learnings. I actually have the same
experiences under my belt, which you can't see it's why I'm... I don't know why I'm gyrated
in just the shirt. You can see my shirt. So, I have the same experiences under my belt that

(03:16):
lead you to be like Steve Jobs, which is you take a little bit of acid, you listen to the
Beatles and you go like, "What is... what is going on here? What were they... how did they
come up with this shit?" It's like, "Oh, they were just doing the same thing that I'm doing
right here." They were all... they were just curious and that's what happens. You write songs

(03:40):
when you're curious about shit. You write something. Either you write a song or you write software
or you write a business plan or a product plan or a marketing plan or a thing, the keynote.
The keynote presentations were... he probably... I'm sure this is common public knowledge that

(04:04):
he put just as much effort into the keynotes as he did anything else. Any other thing
or facet of his life was all him. He wasn't just half-assing anything that he put his involvement
in, so the keynotes were no exception because that's... if he's involved, there's going to be

(04:33):
something... it's going to be Apple. It's going to be an Apple product. So he was... what
the hell? So he's introducing a new product that you've never seen before called the iPod.
Okay, what planet are we on? Okay, so we're already... our minds are fucking blown to the back

(04:55):
of our skulls. Now that he's saying, you can put a thousand songs in your goddamn pocket,
in your pocket. So... but beyond that, not only is he showing you this portable thing, this
bar-of-soap-sized object that fits your whole life of music on there. Also, you could

(05:22):
see in real time, there's like a... there's like a feed going to a projector and you could see
in real time what's on the fucking screen of this new... what? How the fuck did they get the
thing wired to do a real time presentation? How did the fucking... so that pisses me off, how

(05:50):
overlooked this stuff is to most people. Most people just go, "Oh, he was just an asshole,
and if you're an asshole, I guess, magic, the thing just comes out of the fucking sky. He
was just an asshole, that's why." No, he was a fucking genius who was inspired
by deep-rooted, real things and events and cultural shifts and other geniuses. He wasn't just

(06:15):
a fucking individual lone shark. No, he was influenced by every other genius, so he felt
very connected to what was true. Like Van Gogh, same thing. He wasn't alone when he
painted all that shit. He was listening to all these other geniuses who were... he didn't

(06:38):
know personally, but knew. "Now, they would like this. These idiots in my hometown think
it's crap. Yeah, wait until I'm dead. They'll be worth 150 mil." He knew that. You have to
trust your instincts if you're a genius. And how do you know if you even have the right
instincts? You don't. You just have to not let the external, non-genius part of reality gaslight

(07:10):
you, because they could be right. Logic is what trumps everything. There's nothing
above logic. Even Trump is like, "Logic, okay, so let me get out of the way here." Even Trump
in all his bombastic, narcissistic beliefs, they are rooted in having a scorecard that, you

(07:36):
know, that's based on the stats. What are stats based on? Math, numbers. What are numbers based
on? Patterns and logic. You need truth values, ones and zeros, like truth. Either this is or is not
true and that takes logic. So a genius doesn't give a crap about something if it's not logical.

(08:03):
If they can't make it logical, why would they have any interest? And logical could mean
poetry. It could mean what I'm talking about Van Gogh. To him it was logical to paint stars
big. It wasn't logical to other people, but he wasn't following their so-called logic. He

(08:27):
was just going by what he actually saw. Experience is where logic comes from. You have to have
the experience of truth. You have to observe the truth. Then it links to this linguistic
concept of logic, of logic in math. That's a linguistic concept, but it's connected. It

(08:51):
reflects physical truths that you could see. If you couldn't see that two and two equals
four, you would just be trusting a poem. You would just be trusting the language, and the semantics
would also just be subjective. So what's the difference? They have to be, they have to

(09:12):
line up with rigorous patterns, not just whatever floats your boat, bud. No, rigorous patterns,
as in squares, 90 degree angles, orthogonal. No, it's not the same. It's different. You could
label things mathematically and that is what language is for. It's equivalent to language.

(09:41):
There's, there's, you could say, I don't, like math. Okay, I don't want to go down this
rabbit hole of, like, I'm not, it's just off the, what am I talking about here? I'm not
a math guy. Math is not my strong suit. I could talk about things that are not my strong

(10:02):
suit. I guess I'm not, yeah, I'm a little bit tense up about it because I prefer to
kind of be the person giving people someone else's knowledge. I'm not here to learn. I'm
not here to be humbled. You know, I, I don't, I get humbled. I get humbled when I've

(10:24):
right, look at the sun. The sun is going to burn my eyes out, I'll just go blind if I look
at this. I don't feel humble around other people. Obviously, it's going to take,
I had to think about it. When do I feel humble? It's not around other people. It's around dead

(10:45):
people that have died a long time ago. Like, okay, if, if Jesus Christ came back to life,
I would be like, okay, he's a little bit better than me. You know, but then the second he
says something stupid, I'd be like, oh, who, who, what am I worshiping this guy for? See,
that's the only reason we worship him is because he's not longer, he's no longer here to say

(11:11):
embarrassing shit that's proven... that's obviously not true. He wasn't all knowing. We
don't know. Was he? He had, he had a few party tricks. That's not all knowing. He could have
got some things wrong. He'd say, oh, I know everything. No, he just did a couple miracles.

(11:31):
Okay. And what if they didn't work out? You didn't hear about the ones that didn't work
out? You never hear about, okay, "Watch this," and then nothing, and he just flopped. Okay.
Okay. Maybe I couldn't do that after all. Don't tell me I won't use that one again, but
check this out. You just distract them with one that actually works. So anyway, back to

(11:52):
the subject at hand. It was something about Steve Jobs. So Steve Jobs, he, he, I don't know
why we started talking about him so early in the show, but he's my number one influence at the
moment. I'm not like a simple guy who's just like, like worships the same person over,

(12:16):
like their whole life. Like, I'm not that simple. You can't put me in a box like that. Like,
what box? You could, what box? You just gotta go. He's autistic. Nope. He, he knows about things
that are not just trains. You can't put him in that box. Okay. He's, um, neuro atypical. Yeah.

(12:37):
Whoop, there you go. You got, you pinned it down. You nailed it. Um, so we, I don't know, who cares?
It's like, I'm, why did I get on? So Steve Jobs is, uh, he's my number one influence right now,
because, uh, it's a really good time to be doing what he was doing in the eighties, uh, which was

(13:05):
early eighties, like in the nineties, too, actually, NeXT, when he was fired from Apple or whatever the
hell happened to get him to leave Apple, and then he started NeXT and it was like super good technology,
and the worldwide web was designed on a NeXT computer. So he was actually, he was actually

(13:26):
building, uh, tools, hardware and software, that was used by professionals and, and creatives, and
that's what Apple was. That's what Apple became, too. Apple still is based on, uh, NeXTSTEP. I just
found that out. But that's, well, fuck, that's a language. Anyway, it's a linguistic thing. It's like,

(13:46):
okay, you invent one language. Do you want to ditch the whole thing? No, you keep it. You keep it
backwards compatible. Windows does the same thing. Languages, kernels, the Linux kernel. Okay,
so that's boring. So why is it a good time to be influenced by Steve Jobs? Well, it's not. It's not,
if you're not already a gifted type of person, like, I don't want to say engineer, he wasn't really

(14:14):
an engineer. He was above that. He was a visionary. You need a vision before you have, you don't need,
you can have an engineer that just throws, uh, shit at a, at a problem without knowing what
they're solving and you'll get results. But okay, so we got this hinge and it's super optimized to,

(14:36):
to go up and down without breaking, it's super aerodynamic. It's like, yeah, but what the fuck
did you build it for? What problem does it solve? I don't know, man, I'm just an autistic guy who likes
trains and I went to school to become an engineer. So no, he was above that. Not any disrespect to

(14:58):
engineers. That's like, not what I'm doing. I'm not trying to talk shit about anybody. Specifically,
I'm saying that his was a more holistic specialty. His specialty was, um, I don't know, vision. His specialty
was technology, problem solving for, like, people that actually wanted to make shit. He wanted

(15:24):
to make shit for people that wanted to make shit. So okay, so that involves solving problems and
seeing what the problem, you know, seeing what that solution might look like before it exists. He was an
inventor. Uh, yeah, you need engineers to make the inventions work. It's easy to invent shit, by the way.

(15:45):
I, I've invented things. Well, where are they? They don't exist just because you invented them.
They don't exist, right? You gotta pound the hammer down until you, you don't want to do it. By the
time you actually invent a working product, you're not going to want to keep working on it and be,

(16:10):
be happy that you made it. You're going to hate yourself. But okay, I made an app
from scratch. It's ready to launch. Like, you, you put in a lot of work. Some apps are
easier to make. So I guess, I mean, I'm only working on something now that's very difficult. But 10 years

(16:31):
ago, or however, five to 10 years, seven years ago, I built an app that wasn't that hard to make.
Wasn't that hard to make? All things considered the one I'm making now, way harder, but it's also a kind
of way more valuable app. So it makes sense. This was just a good idea. Sometimes if you just have a

(16:57):
really good idea and you have the skills to make it happen, it'll only take a couple of weeks. I
think it only took a month to make this app, which immediately was successful. And I immediately
was like, okay, I guess I have to keep actually developing it. I never did, I never actually finished

(17:21):
making the app. Like I, the app existed in a, in a final form, but it was like so,
there was so much to do to add to, to, to, so many features that never came around. I never even made
a website for the app. There was no website. It was just a, it was just a chatbot before even AI

(17:44):
chatbots. It was just mechanical. So anyway, that's just something I did when I was a teenager, or a 25
year old, some shit. I was, I was 28. It matters. You want to, you want to, like, you want to, like,
say, fake it, you want to say, oh, I was only 20. No, I was older. I was like 28. I had to know how to run

(18:06):
Dogecoin nodes, you know, full wallet nodes on a Linux server, securely, and automate it with the,
the JSON API or whatever it's called, the RPC API. If you've ever used the, the JSON-RPC

(18:27):
Bitcoin API, we could hang out. That doesn't... I don't think I've ever met a single person
who's ever even heard of it. I guess I'm sounding like a douchebag. There's people in just this town
who have definitely used the Bitcoin API. It's just that I wouldn't, I would be like,

(18:50):
intimidated by them. So let's just pretend they don't exist. Let's just pretend I'm the only one.
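(For anyone wondering what "using the JSON-RPC API" actually looks like, here is a minimal Python sketch of the kind of call he's describing — not his actual automation. The URL, port, and credentials are placeholders for whatever your own Bitcoin or Dogecoin node's config uses.)

```python
# Minimal JSON-RPC call to a local node -- illustrative only.
# Assumes a node running with rpcuser/rpcpassword set in its config file.
import requests

RPC_URL = "http://127.0.0.1:8332"      # Bitcoin Core default port; Dogecoin Core uses 22555
RPC_AUTH = ("rpcuser", "rpcpassword")  # placeholder credentials

def rpc_call(method, params=None):
    """Send one JSON-RPC request to the node and return its 'result' field."""
    payload = {"jsonrpc": "1.0", "id": "demo", "method": method, "params": params or []}
    resp = requests.post(RPC_URL, json=payload, auth=RPC_AUTH, timeout=10)
    body = resp.json()
    if body.get("error"):
        raise RuntimeError(body["error"])
    return body["result"]

print(rpc_call("getblockchaininfo")["blocks"])  # current block height
print(rpc_call("getnewaddress"))                # a fresh receiving address
```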
That Satoshi isn't real, you know. He was real. He was... he died of ALS. That's what I think, right?
It's probably that, it's probably Hal Finney or... whatever, it was probably Hal Finney.

(19:14):
Anyway, but he's dead. You can't, you imagine being alive, being Satoshi and just being anonymous,
not spending any of the Bitcoin. So he's obviously dead. So anyway, obviously there's,

(19:34):
there's other people like Vitalik Buterin who know how to engineer a... well, he came second,
Vitalik Buterin. I don't know, smart contracts, that is brilliant. I still don't understand smart contracts
or liquidity pools. I don't think most people who are using them understand them. They just know,

(19:57):
okay, it's a way to get free money. It's a way to steal a shitload of money very easily.
And there's like, smoke and mirrors and shit with liquidity pools and
governance of scam coins. What do we call them? Pump and dump schemes. What am I talking about? This is...
so, so you don't need to be that good at... well, you know, you'd have to be a programmer

(20:23):
who is pretty good at programming to do what I did. To also think of it and design it and launch it,
you have to also have that, whatever that means. So in other words, I wasn't even who I am now.
This was before I was even talking about Steve Jobs. I was a much more humble version of myself like that.

(20:47):
And this was, you know, this was before I knew what I was truly capable of. My final
form had not yet been realized. Now I could program the protocol from scratch and the client

(21:07):
and everything else about it, everything about the protocol and the client and the nodes,
if I wanted to, which I don't. Why would I do that? Those people that have done that,
I was like, whoa, those guys must be really smart. Those people that recreate
the blockchain from scratch and they, they write their own proof of stake algorithm. And I used to

(21:34):
look at those people and go, wow, they must be really smart. Nope. They just learned, they just went down a
road that pretty much anybody could go down if they choose to go down as a programmer with enough
technical background in object oriented, just programming software development. It's, there's a very,

(21:56):
it's, it's very teachable, right? But anyway, especially if you just copy somebody else's
fucking idea, that's not that impressive in retrospect. I still don't, I still don't know shit about
cryptography. So anyway, I could do it. I could learn it, though. There's no limits to what I could do
programmatically. And there's where the problem comes in. So now that we got AI programming,

(22:24):
which is better than any human, I don't care if you're Vitalik Buterin. If you're somebody that
can write the Linux kernel from scratch, there are some ways that you're going to overlook how,
like these, these things in some ways are going to be like, whoa, you're going to blow your, I don't know,

(22:44):
maybe not, maybe not if you're the guy that made the Linux kernel. But in all other situations,
you're going to be like, okay, this is useful. Maybe I don't know, it's just useful. It just does things
faster. You still have to prompt. But if you prompt it correctly, the amount of work that gets done

(23:10):
after you've just pressed that enter key, it's like, holy crap, that would have taken me so long.
So it would have been so tedious. And so not only that, so the bad part of AI,
first, is that it's addictive, because you see all this shit happening in, over in two seconds.
Like, okay, there you go. I got a terms of service. I got it. That would have had me hiring

(23:35):
a lawyer. Okay, I got the login system working. That would have been a huge pain in the dick.
It's just working. It's just working. After a second of me asking for it to work. The CSS,
that's a pain in the ass sometimes. So anyway, the point, the problem is that it introduces bloat.

(23:57):
And you have to be like, oh, I didn't ask you to do that, but at the same time,
that's kind of a good idea. So now we have to fully implement it and tweak it just right.
So it adds technical debt and just time that you wouldn't normally be spending because you wouldn't
have had the skills to even think of it in the first place. So it gives you what you think is this

(24:20):
higher bar of skills and it does. It really does because if you're willing to learn what it's doing,
you're pretty much at least on some level qualified to explain it and to do it.
If you have the patience, nobody is just, I mean, I'm not talking about shipping a whole

(24:45):
application without knowing anything about it. You have to know everything about it before you just
ship it, but that's what AI is for. So yeah, it'll make something that's like a black box.
Then you ask it and it'll explain what it's doing. So yeah, you still have to read and understand and
learn a lot. But programming no longer has a lot of typing involved on the code front. You don't have

(25:11):
to highlight syntax and be like, oh, I missed a semicolon. All that shit is handled. It's just the
mechanics of the code. It just has to be a good app. That's now the only hard part is now you just
actually have to build something good. There's no longer any pain when it comes to the process itself

(25:36):
of development. Yeah, there's still a pain. There's still a pain in the ass when it comes to it.
Far less than before, you can end up with a prototype overnight and then you'll find out like, oh,
the idea just sucked. That's what was stopping me this whole time. Wasn't my coding skills. I just
wasn't very good at thinking of things that people need in their lives. Maybe one, okay, maybe,

(26:03):
maybe that's why most people aren't poets and most people aren't movie directors or literature
writers. They just consume, yeah, because it's hard to create something original and good.
So Steve Jobs has a quote where he said verbatim doesn't matter if it's verbatim,

(26:30):
that a lot of people like simplicity, but they don't want to put the work in to get it.
So they end up with these complicated messes in their tech stacks and their products. Their products
become these disgusting pieces of garbage because they're complex and he's like, there is a simple

(26:56):
solution to every problem. There's an elegant solution that solves a whole swath of problems in a
simple way. It's just that most people just don't want to do the work to get there. So they ship
these disconnected things that are just gobbledygook, gobbledygook, whatever, most things. They don't

(27:21):
have the cohesiveness. Most products, or... Apple products, even current Apple products, are not that
interesting. Yeah, when's the last time? Because Steve Jobs is dead. So you have to have that folksy
1960s thing. It was a time and place for that type of thinking, and also you have to realize that the time

(27:48):
and place does not always set the scene. The visionary is the one that gets to set the scene
because we're all visionaries. You just don't know it. The set, the scene is not dictating
your mood and truth. You are playing a role in that as well. So anyway, that's what I was saying is

(28:12):
like I am good at self-promotion. I just don't honestly have what am I promoting? What am I trying
to sell? I don't have any products that I'm ready to release very far from that to be honest. Like
anything if I, when I do am I going to talk about it? No, hell no. I don't have to promote anything

(28:40):
on it. There's other avenues for that that are much more targeted. Can you imagine me just trying
to tell random people, "Hey, I have an app." Oh, you're not in the target audience, but hey, you should
know about it anyway because I want you to know. I don't give two fricks of a fuck. If you know

(29:02):
what, that I have some fucking thing, I don't care if you think I'm a loser who just sits and
jerks off to hentai all day. If you're not in my target audience, you could, like,
why are you even, you're not in my target audience? Like, it's like, I don't know how to respond to that.

(29:23):
There's no reason to respond to it. So I'm not going to. It's like a celebrity addressing haters. It's like
you're just going to look bad, right? You're just going to look like you're defending an idea.
You're not, okay, I don't know. If you're a celebrity, they do care about staying on the good graces of

(29:53):
whatever. Okay. Steve Jobs wasn't really that type of person. He wasn't really like a celebrity.
He was historical. He was more important than every celebrity, more influential, like, more
substance. And I already said important. I mean, how much do you have to, how many more words could

(30:19):
you come up with? He was better than celebrities. That's not the point. Like, he didn't become a celebrity
because of celebrityism. He was a celebrity because he was an inventor that by default,
yeah, we're probably going to know about the guy that invented the thing that we're using to

(30:44):
listen to music. Yeah, we're probably going to eventually hear about that guy, about the guy that
invented the computers that most artistic people use. Yeah, we're probably going to hear about
that guy, not because he wants to be heard from. And other people, like fame just comes, it's just
bestowed upon you. You don't really choose it. It just comes from somebody else

(31:09):
letting the floodgates open and, like, okay, this right here, it's like, okay, there you go. That wasn't you
that did that, you know, it shouldn't really have much to do with your self-worth. Yeah, now it's like,
okay, somebody else made you famous. Like, you had nothing to do with it. Did you know this?

(31:30):
Do you know that nobody made themselves famous? It only happened because somebody,
like some news agency exposed them, like brought a bunch of a huge platform to them. Nobody just
was like, "Raw!" Like, nobody just becomes famous on their own accord. You don't really have much

(31:53):
control over it is in other words. You know, you could aim, you could aim for something that might
get you there, but if you're just aiming for fame, you're, you're just, you're sucking somebody else's
dick. You're just playing somebody else's game and then once you get it, that wasn't your control,

(32:17):
wasn't something you did, it's some other machine that has now collected you into its, into its control.
And a lot of people are going to be assholes to you thinking that you're the thing that they're
talking to, but they're just talking to a carbon copy, like a, a... what, I don't know.

(32:44):
So, anyway, but the thing that people desire about fame, which, of course, everybody wants
clout, who the fuck doesn't want clout? That's different than fame. You could have clout without fame.
Clout, everybody wants clout, but then there's the whole thing of clout chasing. So you're just chasing

(33:08):
clout for this, so that's not clout. If you just have clout because of exposure and your... yeah,
anyway, so I'm talking, I sound like I'm, I'm getting, I'm starting to fall asleep.
So let's, let's get back to, so that's the point. That is the intro that I wanted to, to get
off the chest, is that I don't have anything to promote, and that's... So that's my, it's not like,

(33:38):
I'm one of these guys like, you know, man, if I was just better at, if I was just better at self-promotion,
I would be everywhere. It's like, no, you're, you're not that interesting. I'm pretty sure,
because otherwise, other people would make you famous. They would just feel new. It's free.

(34:02):
It takes, it takes no money. It takes zero dollars to make somebody else famous. Promotion is not the
hard part. That's not the part you're failing at. Yeah, it's, it's just that it hasn't struck a
chord in the right community. But whatever you're doing just hasn't struck a chord. Yeah,

(34:27):
doesn't mean it's not good. It just means it's not mass good. It's not good enough to have a viral effect.
It might be good, but not good enough for people to go, hey, this guy, right? Yeah, you don't want to,
you don't want to force that. You don't want to force the, the clout thing is like,

(34:54):
if I just made myself famous, they'd never... they think it'd be good. They'd tell people, okay, if you do
this for me, if you just help me out here, then I'll be set. So you're just chasing, you know,
you're just chasing clout for the, chasing for the sake of fame with the,

(35:21):
yeah, I wouldn't force fame because the faster it comes, the quicker it's going to go and you don't want
to be a meme, you don't. I used to think it would be fun to be a meme, to be like, recognized in a
famous meme that went viral. Yeah, that would get old in about two weeks and then the rest of the next

(35:43):
20 to 30 years, however long, however famous of a meme it is, that's your whole life, your whole life is
just avoiding people who are going to be like, whoa, is that the guy? And there's no currency in that.
There's no positive currency in being a guy from a meme. You don't get anything from it.

(36:09):
People go up to you, they go, okay, can I take a picture, and then they leave, and you go, okay,
I gave somebody something, what did I get out of that experience? Somebody kind of
used me, at my expense, in a way that made me feel uncomfortable. That's it, that's what you get for being

(36:30):
famous from a meme. What else? So if you ask me, with fame, you don't want the kind that just comes,
you want the kind that's, like, gradual and earned, based on a core customer base,
because you're actually providing a direct service to those people.
And then it's not about the fame, it's about the connection with those people, it's about the service,

(36:56):
now I sound like a fucking hippie, now I sound like a freaking schoolgirl, now I sound
like a guy that volunteers at a church, which doesn't exist anymore. Who doesn't exist?
No men, no men volunteer at churches. Pfft, women don't either, except for the women that still

(37:23):
go to church. The same ones that go to church also volunteer at church. Yeah, what a coincidence.
Men don't volunteer in general, so the odds of them volunteering at a church, pretty low,
but it's getting higher. Now that church is becoming cool again. Anyway, I was talking about,

(37:50):
yeah, promotion, you need something good to promote. The second I had something good to promote,
I would promote it. We live in a world of marketing, that's all anything is as marketing for almost,
not really. YouTube is not just marketing. When you consume content on YouTube, that's not marketing,

(38:15):
that's giving you information, it's a consumption of some kind, but it's not marketing. So I am working on
a new app service, and it's not even close to being ready to sell. That's why I hate the
Silicon Valley bullshit. It's all about getting funding before you even have a working

(38:45):
fucking product, and it's like, okay, we got a prototype, it's like, okay, it sucks.
Okay, so how much more millions of dollars before it doesn't suck? Couple hundred mill. Okay,
you got it. Still sucks. Okay, they don't get a hundred mill, but like a couple mill. I don't know
how much like like 10 million is not that much for an app to get in like seed funding, or it's still

(39:09):
a lot, like because there's so many apps that are all the same garbage. It's like, okay, so this,
it finds events in your city. It's like, yeah, there's a million of those. Like, no, no, no, no,
this one has a slick website. No, yeah, they all have slick websites. Eventbrite has a good one... no, no, no,

(39:30):
this one has, this one has a better, this one was programmed in Python with a nice React framework.
Yeah, that's not going to do shit. The idea has already been done.
So I have an idea that is good, that hasn't been done. And so here's the catch. Okay, so you have a good idea.

(39:59):
It hasn't been done. And you have the skills to do it. Yeah. Okay, then what's the catch? The time,
the time that it takes to create a high quality app, not polished. I'm not talking polished. I'm
just talking about the MVP. Yeah, I'm not even close to done with that. It's not even close to good.

(40:22):
Every day it sucks a little bit less. Every day I go, okay, I made it suck. Just a little less
in this one area. The whole thing still sucks, but this one feature is now kind of making sense

(40:44):
and I zoom out and like, okay, the feature that contains this feature that I fixed
still sucks. So the whole app, why would I even talk about it? The feature that even makes it
worthwhile isn't even good. So why would I even talk about the app? And I spent a whole month just on

(41:09):
the infrastructure for the app, not on the algorithm. I needed to work on the infrastructure first.
Now that I've got the infrastructure, I got the data coming in. It did help to have the infrastructure
because now I got the data coming in for all of the optionable... it's a finance app. I guess I

(41:36):
guess I should probably say what the niche is. It's in the finance niche. So no big deal there.
It's no problem, right? That's an easy niche. And it's to find you opportunities for trading
options. Not stocks, no, that's too easy. Derivatives. An abstraction on an abstraction.

(42:01):
Stocks are an abstraction, right? That's like a fake stake in a company that we just, say, pretend,
we just pretend that this person owns a piece of it. Sure, that's already pretty abstract.
Now we layer on top of it, options contracts. Okay, but what if I want to sell? Maybe I want to buy

(42:24):
these fake, these shares in a company that I own. Technically you do own the company. Okay, so
okay, still abstract. Everything is just a way of saying this is that. Everything's just a metaphor
for something else. Okay, now that we've got all of philosophy summed up in one sentence,

(42:48):
now we can talk about finance again. So I'm doing a product called Options Aware. OptionsAware.
That's the domain name that was available. Optionsaware.com. Or optionaware if you don't want to use
the s... yes, I got that one too. So Options Aware is an automated, set-and-forget system and bot

(43:18):
that tells you, hey, wake up... after filling out a form. That sounds fun, like filling out a form. It's
much more high tech than that. It's a wizard. You fill out a form through a wizard, a step by step
wizard. It collects these personality traits about you. It's not... it's data. It collects data

(43:41):
on you. And then it translates these simple English questions into numbers that you will not understand,
that tell the algorithm, okay, this is what this guy wants. This is how he likes his trades. So then,
once it knows your trading personality for the given strategy, which is pre-labelled as cash

(44:08):
secured puts, is the first one that I'm doing. So it tells you every day, okay, we know which types
of strategy preferences you have for cash secured puts. So, boom. Every day, we're going to
tell you, here's the best, best trades open right now. And this is extremely involved and complex

(44:30):
because there's a million ways of going about this. You got to draw the line somewhere.
Like, okay, if this, if you have a high risk tolerance, then we'll give you this. You know,
it sets the parameters based on these simple English questions that it asks you. Like, okay,
what's your... how comfortable are you being assigned? For example, I don't know, not comfortable. Okay,

(44:54):
what's your risk tolerance? Do you want even aggressive, very aggressive, conservative,
very conservative? So it asks you these simple questions. It does all the hard shit in the background.
It does all the math instantly. It looks up thousands of tickers and then dozens of options contracts

(45:17):
on each ticker. So we're talking hundreds of thousands of contracts have to be tracked by this
fucking system. The reason I'm cursing is because I'm actually working a blue collar job right now.
You think this is white collar? Do you know what it's like having thousands of symbols floating around on a
fucking computer? That's blue collar. You think I'm having fun. You think it's roses. You think I'm just

(45:44):
drinking piña coladas while doing this shit. So it's still not even... the data is the
hard part. The algorithm is also the hard part. But to get to the algorithm, you need the input data
coming in every second. Within reason, you need to be pulling real-time prices for every single

(46:09):
option contract on the market. And then you whittle it down by delta, by premium, by other metrics,
percentage of profit. And then you come up with... you whittle it down. And so the data is a nightmare.
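(To make the "whittle it down" step concrete, here is a toy Python sketch of that kind of cash-secured-put screen — mapping a plain-English risk answer to a delta bound and filtering by premium. It is not the Options Aware algorithm; every field name, threshold, and mapping here is invented for illustration.)

```python
# Toy cash-secured-put screen -- illustrative only, not the real product.
from dataclasses import dataclass

@dataclass
class PutContract:
    ticker: str
    strike: float
    expiry_days: int   # days to expiration
    delta: float       # negative for puts
    premium: float     # credit per share

# hypothetical mapping from the wizard's plain-English answer to a numeric bound
RISK_TO_MAX_ABS_DELTA = {"conservative": 0.15, "moderate": 0.25, "aggressive": 0.35}

def screen_csp(contracts, risk="moderate", min_annualized=0.10):
    max_abs_delta = RISK_TO_MAX_ABS_DELTA[risk]
    picks = []
    for c in contracts:
        collateral = c.strike * 100                                   # cash set aside per contract
        annualized = (c.premium * 100 / collateral) * (365 / c.expiry_days)
        if abs(c.delta) <= max_abs_delta and annualized >= min_annualized:
            picks.append((c, annualized))
    return sorted(picks, key=lambda p: p[1], reverse=True)            # best return first
```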

(46:29):
I mean, that's not, it's a lot. You need to track thousands of symbols and each symbol
dozens if not hundreds of contracts. They all have a different price. Every contract has a different
price. They have to be tracked. So anyway, there's ways to optimize for this. There's ways to cache

(46:51):
the prices, and there's ways to only re-pull the prices when necessary. So that's part of the
development and optimization part. But that's also part of the core algorithm itself.
Because it has to be fast, not just accurate. We could get accuracy, but does that mean it's fast?
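(The "only re-pull the prices when necessary" part is, at heart, a timestamped cache. A minimal sketch, assuming some vendor quote function you would plug in yourself:)

```python
# Re-fetch a contract's price only when the cached quote is stale -- sketch only.
import time

CACHE = {}            # contract symbol -> (fetched_at, price)
MAX_AGE_SECONDS = 15  # freshness window; tune to your data vendor's rate limits

def get_price(symbol, fetch_quote):
    """fetch_quote(symbol) -> float is whatever real quote call you actually use."""
    now = time.time()
    hit = CACHE.get(symbol)
    if hit and now - hit[0] < MAX_AGE_SECONDS:
        return hit[1]                 # still fresh, skip the network call
    price = fetch_quote(symbol)
    CACHE[symbol] = (now, price)
    return price
```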

(47:11):
You want both. This is not going to be a high quality product until it can do both: real-time,
finding opportunities, and being good. So that's what I've been doing for the past month and a
half besides other projects, which has just been the neuroscience one, which I already talked about.

(47:34):
So the other thing I'm doing is the science and it's really, it's neuroscience. It's not just
philosophy where I'm just talking shit out. I'm looking into, I'm actually doing data
crunching and fMRI analysis. So I'm downloading terabytes worth of data, of raw fMRI data, some of

(48:00):
it's pre-processed. So it's pre-processed, so it's not raw. But you want it to be pre-processed.
It would be either them or me pre-processing the data. So you get the data cleaned up,
fMRI brains. And it's cleaned up, so it's... everything's normalized. And it removes the

(48:22):
cerebrospinal fluid and removes anything that you don't want. So you can see just the gray and white
matter. And you can do a lot with just modern, high-tech, expensive computers, not low-end. You still need
a very good... you need a Mac. So if you have a modern M4 Macintosh computer... they're not called Macintosh.

(48:48):
I have a Mac mini, they're all, they're called Macs. I have a Mac mini right there. And it's, it
processes fMRI data using custom software that I wrote in Python. Python. I don't write everything.
I'm using libraries to analyze the brains. And it tells you, okay, here's all the parts of the brain

(49:13):
that were active during this part of the stimulus in the experiment, whatever in the data set.
So I know it pretty well. Like I actually scrubbed through the actual data, like, okay, so I got this
fucking data set of 184 subjects, people's brains, each brain is a different subject, different person

(49:39):
doing a different test. And they all did the same type of tests. It's four movies, four clips,
15-minute-long video clips that they were exposed to, and there's rest, it starts with rest.
But even during the rest, their brains are all sort of doing all sorts of shit.
So I tried, I looked at this, I did it, and I did a meta analysis where we pulled data.
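(For a sense of what that per-region crunching looks like in Python, here is a minimal sketch using nilearn, one common library for this kind of analysis. The file path and atlas are placeholders; this is the general shape of such a pipeline, not his actual code.)

```python
# Extract one activity value per brain region per timepoint from a
# preprocessed movie-watching run -- illustrative pipeline sketch.
from nilearn import datasets
from nilearn.maskers import NiftiLabelsMasker  # nilearn.input_data in older versions

bold_path = "sub-01_task-movie_desc-preproc_bold.nii.gz"  # hypothetical preprocessed 4D file

atlas = datasets.fetch_atlas_harvard_oxford("cort-maxprob-thr25-2mm")
masker = NiftiLabelsMasker(labels_img=atlas.maps, standardize=True)

# shape: (n_timepoints, n_regions) -- this is what lets you line regional
# activity up against the movie's timeline, second by second
roi_timeseries = masker.fit_transform(bold_path)
print(roi_timeseries.shape)
```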

(50:05):
And then we summarized the data of the whole population and then the individual brains, we looked at
what's going on with like which parts of their brain light up at different times of the movie.
And we go pretty granular every second of the movie, we know exactly what's happening in the brain and

(50:26):
the regions of interest. And it's very accurate to the millimeter of which area in the brain is
doing is lighting up and how active it is. And I even made a website that loads the movie,
shows you the movie on one side and shows you the brain diagram and it shows you in real time as

(50:50):
the movie is playing, which areas are active. I made this from scratch because it's, why not? It's
cool. I wanted to actually see what the hell is going on with the brain in the way it syncs up with
the movie. It actually does. Like you could scrub through the movie and you could just see these

(51:13):
changes in the brain. Kind of like it's random. There's a pattern to it. You could see at certain times
of the movie the content changes along with the brain. You can't tell what exactly, you can't see
a correlation, but you could tell that there's a pattern just because you could just scrub through

(51:36):
the movie and you'll see, okay, the brain is clearly changing in relation to the content.
It just, you could just see it with your own eyes. So that demonstrated that the analysis was working
because I could see it with my own eyes. I could see the data moving. I could see the regions of

(52:01):
interest getting bigger and smaller visually. So that proved that this was an empirical
extraction. It wasn't just like, hey, maybe, I don't know, maybe this is working on, no, no, no.
No, I could actually see it with my own two eyes that the content matched. And we also did more than

(52:22):
just, and then we did... oh, Jesus Christ. Then we used AI to get embeddings from the video.
Yeah, I don't really want to talk about the AI part. That's kind of what you have to do
to actually get the regions of interest and the similarity scores. It's called region similarity analysis,

(52:45):
that's the method I was using to see which areas process concepts, similarity, models, modalities,
similarly. Anyway, I haven't been working out for a couple of days. You get a lot done in a
couple of days if you have... if you're addicted to AI.
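(A rough illustration of the region similarity idea — not his actual method: build a timepoint-by-timepoint similarity structure from the video embeddings, build one from each region's activity, and correlate the two. The windowing trick and every name here are assumptions made for the sketch.)

```python
# RSA-style comparison between video embeddings and per-region activity -- sketch only.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def region_similarity_scores(video_embeddings, roi_timeseries, window=10):
    """video_embeddings: (n_timepoints, embed_dim); roi_timeseries: (n_timepoints, n_regions)."""
    n_t = roi_timeseries.shape[0] - window
    video_rdm = pdist(video_embeddings[:n_t], metric="correlation")  # time-by-time dissimilarity
    scores = []
    for r in range(roi_timeseries.shape[1]):
        ts = roi_timeseries[:, r]
        # short sliding windows stand in for a "pattern" at each timepoint
        patterns = np.stack([ts[i:i + window] for i in range(n_t)])
        brain_rdm = pdist(patterns, metric="correlation")
        rho, _ = spearmanr(video_rdm, brain_rdm)                     # how alike the two structures are
        scores.append(rho)
    return np.array(scores)  # one similarity score per region
```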

(53:11):
So the meta analysis, the population report, analyzed combined aggregates of 100 brains' worth of data. And it started to coalesce,
showing, like, which parts of the brain handle abstract concepts and which handle more concrete,
visual stuff. It seems that vision is the most concrete part, the most concrete modality in terms of

(53:36):
the brain. Any other modality, it's a crapshoot. Concreteness is just visual. Auditory is not
touch, motor, it's metaphorical. It's more abstract. Vision is like mathematical. We could predict how

(53:57):
the brain activates. We could predict the brain's activity pretty well with visual stimuli. We could show
an image to a brain and be like, it's probably going to do this. But we can't do that with things like
language and other things like touch and motor. We can't predict those. Is touch concrete? I didn't test touch,

(54:23):
I didn't test with motor. I don't know what the fuck I'm talking about with motor. I think we can actually...
maybe, maybe that's... that'd be weird to just make an assumption without knowing for sure. I think the
reason is, no, it's not an assumption. It's because I believe the, the brain devices, the brain...
whatever the fuck they call brain computers, the things that connect to your brain, I think they do

(54:47):
predict modal, I mean, motor signals with high accuracy through simple machine learning models.
So it's not just vision. It might be motor as well. But, but language is abstract. It's fuck,
we don't know how the whole brain processes language. It's different for everybody. It's, it's,

(55:09):
it's on different sides of the skull yet they both do different things regardless of which skull.
It's lateralized, not lateralized. If you're right brained, you're going to be using most of the right
brain to process semantics. But, doesn't mean you're only using your right brain. You still have to
use your left brain for some things, for sequential things. The right brain is better at holistic

(55:35):
meaning making. The left is better at sequential, the parts, whatever you call that. I don't know.
Not holistic. Anyway, but even left brained people, the people that are strongly lateralized,
lateralized to the left, whatever the fuck that means. Some people are just left brained. That's true.

(55:59):
What does it mean when you say people are left brained? Well, it's really what they, they say language. It
means most of your language handling happens there. Even so, even if you're left brained, it's still not
exclusively all happening on the left. The brain, the right hemisphere is still going to be used

(56:21):
for semantics that have to do with the more abstract, the more, how you would say non-linguistic,
even though like vision, holistic metaphor, but the left side is more rigorous concrete. Anyway,
but it's still abstract because language is abstract. Nothing, I mean, okay, things can be read,

(56:47):
things can be purely linguistic like a word, right? Right? No, that's not true. There's no such thing as
a purely linguistic signal. Everything is a language to the brain. So that's the premise that led me
to become a researcher. I figured out that everything is linguistic. At the end of the day, it has to

(57:10):
be converted to some kind of meaningful semantic concept, right? So that does involve the language
centers; whether it's visual in the beginning, it still has to be converted to a linguistic concept,
or non-linguistic concept. What's the difference? That's the point. There isn't a difference between a

(57:33):
vision concept and a non-vision concept or a linguistic or a non-linguistic concept. All concepts
are linguistic by definition because you need linguistics to have semantics. You can't have meaning
without words encapsulating the meaning. Okay, that's a very introductory elevator pitch

(58:00):
thing of what I'm doing. I discovered that on my own accord. I'm a true original in that,
but nobody has discovered this yet. They haven't published it yet. I'm not saying I'm the first to
figure this out. It's kind of obvious, I think. I think people will kind of start

(58:25):
ringing the bell pretty soon, just because it's in the name. Large language models.
What is the word that's in there that you can't remove? Yeah. Languages are pretty much where intelligence
comes from. You can't have intelligence without language. So is it a coincidence that large language

(58:49):
models are the first thing being called artificial intelligence? No, because language is
intelligence. So I'm not a genius for cracking this simple puzzle. Yeah, okay, I feel like a
chicken fucking genius compared to most people, but I am able to look outside my immediate surroundings.
I know that there's other people out there that are actually pretty smart. Smarter than me. I haven't

(59:16):
met anybody yet. Once in a while, I'll be like, "Oh, look, he's smarter than me." But yeah, I'm sure there is
at least one who is, like, smarter than me. But that's it. How would you measure that?
There is a way. You could measure it. Yeah. And the more you learn, the more you find out, you know,

(59:38):
no shit. Like, just because you cracked a puzzle, there's more puzzles to solve.
Okay, are we done yet? Is this done? No way. It was a fucking hell. Okay.
If you don't count the false starts of the intro, it's only about an hour, so I'll give you another

(01:00:01):
minute. So what else has been going on? So this app, the Options Aware thing, huge pain in the ass.
I'm not complaining. This is just... okay, I'm sure I'm complaining, but that's part of the job.
I'm also getting shit done. And I don't really get anything from complaining. It doesn't really make me feel

(01:00:25):
like I accomplished anything. I just talk about how hard it is. It doesn't really feel like,
like, I'm accomplishing anything. I think that's why I complain about it. It doesn't feel like I'm
accomplishing shit. You could make a custom piece of software that shows you real-time fMRI,

(01:00:49):
like, results, like, whatever. You could do something that's, like, very, like, cool to you. But if nobody
else tells me it's cool, I'm not going to think it's that cool for more than, like, a day. That's how Steve Jobs
was. He wasn't talking about shit from, like, 20 years ago. He wasn't like, okay, but the Apple II,

(01:01:12):
that was when shit was the best. No, he was talking about shit that was still to come.
There was always, there's always something cooler out there. And that brings me back to the whole
narcissist thing. You could be a narcissist, but if you're wrong, that's not going to look very

(01:01:34):
good. The reason narcissists end up in high positions is because, as much as, yeah, people might hate them,
they're getting some things correct. But if you don't admit defeat when you're logically wrong,
like, stocked in rush, who it turns out just was a bad, evil person. He's just an evil asshole.

(01:02:02):
And was he evil or just dumb? Well, if you kind of know that you're doing it...
that's the thing. Where do you draw the line? Can you assign evil to a narcissist that is only doing it out of
sickness? If you don't have the self-awareness and the intelligence to take responsibility, to see it,

(01:02:28):
you have to at least experience the evil for yourself. You have to have some to some
awareness of the morality. I don't know. To be evil, you kind of have to be okay with being evil.

(01:02:52):
You have to kind of be like, okay, I guess I'm evil. If you could prove that, that stocked in rush was
that, then yeah, he's, but he seemed like just a narcissist and that's different than evil.
They're not good enough to be evil. That's not an insult. This is an insult.

(01:03:21):
They would take it as, as like, you mean you're calling us evil? No, you're not. They want to be evil.
Narcissists would like to be called evil, but they're not, they haven't done anything to earn that
title. They're just sick. It's a form of unintelligence. It's a form of retardation to be a narcissist.

(01:03:47):
You don't look at it as evil. It's like, no, you have to work up to evil. You have to kind of be smart to
be evil. Evil takes awareness. Can't just be ignorant and evil, right? Hitler is called evil because
he kind of knew what he was doing, right? He wasn't just kind of sick. He was also evil.

(01:04:14):
Let's not get into that debate, but narcissism isn't a form of intelligence. It's not like a positive
trait. It's a wound that is festering in response to something being broken. It's two things not

(01:04:36):
meeting, and something forms in the middle, which is rancid and diseased and unsatisfied and it's
unsustaining. That's what diseases are. Diseases don't head towards a goal of happiness. No, they head
towards destruction, and okay, the disease got what it wanted, and now what happened? The host is gone.

(01:05:00):
Everything's gone. It killed the host. That's what narcissism is. It kills the host. You don't want to be
a narcissist. It just hurts you. It puts you in survival mode and yeah, that's not like a thing to

(01:05:23):
be. That means you are lacking something. Anyway, this has been episode 113 of the 10-pharish
rhubarb. I'm sure I'll see you probably in a couple of weeks, because I'm not, you know,
I'm not going to really want to do this. I'm not going to want to do this again next week. Maybe,

(01:05:47):
hey, what could I say? I'm doing things that are more fruitful and important to me than this is to you.
Sorry. Let me rephrase that. This is not as important to me as it is to you.
