Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is the Andresgovia Show.
Speaker 2 (00:04):
Chat, and welcome to the Andresgovia Show once again. It's good to have you on.
Speaker 1 (00:09):
Yeah, I always enjoy our conversations and getting the chance to chat, and I'm glad we can do this again.
Speaker 2 (00:14):
Absolutely. And I guess because I mentioned right there that you're the CEO of Pickax, I want to kick off with Pickax, because last time we talked it was a little different: it was still in the pre-launch phase. And as we record this, I think we're getting close to the big launch, and the app launch too.
Speaker 1 (00:34):
Right, yeah. So the website's pretty much rocking and rolling as is. People can sign up, and people have been, which has been awesome; we're building a really cool community. And we are weeks away from launching the app, so we're beta testing right now. We've been onboarding our Pickax Ambassador team, and next we're going to be onboarding our creators and users. And I'm going to be
(00:57):
looking at the referrals. We have a referral program on Pickax, so you can use a tracking link to promote Pickax, and whoever's at the top of the leaderboard will get first access to the beta app. So that's kind of how we're doing it.
Speaker 2 (01:12):
Yeah, and last time, when I talked about Pickax, I said I felt it was like a LinkedIn, but for medical journalists and anyone that's probably been censored more often than not, yet free to speak on Pickax. And since the influx of a lot of Rumble creators, particularly in the gaming community, that jumped over, it's made
(01:34):
it so much more lively, and I think that was a great injection of energy that we needed. So there's a lot of excitement buzzing around Pickax that I've noticed.
Speaker 1 (01:44):
Yeah, it's been a lot of fun. And I think getting so many gamers to come over to the platform has been a game changer for us, no pun intended, because it makes for interesting conversations. The big reason a lot of the gamers have come over is that we have the Rumble integration. So if you post a
(02:05):
Rumble video on the platform, it's playable in the news feed. If you're a creator, you can actually connect your Rumble channel to your Pickax account, and it'll automatically post your Rumble video to Pickax whenever you go live, in real time, so you don't have to copy and paste and go over and post the show and all that. It should just happen in real time. So it's a really cool way to do it. And because of that, a lot of the gamers are saying, we have no social media platform
(02:28):
where we can promote our Rumble videos, because you get demoted in the algorithms if you post an outbound link. So they're like, finally: if we can get our audience to come over to Pickax, not only can we promote our videos without getting dinged, but they're playable right there in the news feed. And there's actually an account that's on Pickax, and they're on Rumble and all that kind
(02:48):
of stuff, and they do training and tips on mastering Rumble, and they did a test where they posted one of their old videos to Pickax, and then they looked at the analytics, and they got six hundred new monetized views on Rumble just from posting it on Pickax.
(03:09):
So for them, they're looking at it like, okay, cool. Not only can we get some more viewership, but we can also get the monetized views on Rumble. And technically speaking, when we roll out our new monetization system and revenue share and all that kind of stuff, if you're on Rumble you can kind of double dip, because you're earning revenue through the advertising on Rumble, and you can earn through the advertising on Pickax if you get
(03:31):
everybody to watch your Rumble video on Pickax. So you're able to double dip in that way. So we're getting a lot of people coming over just specifically for those Rumble videos.
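For readers curious how an integration like that could be wired up, here is a minimal Python sketch. It only illustrates the idea Jeff describes, watching a connected Rumble channel and auto-posting new live streams to the Pickax feed; the function names, the polling approach, and the stand-in callables are assumptions for the example, since neither Rumble nor Pickax has published this implementation.

```python
import time
from typing import Callable, Optional

# Hypothetical sketch: the real integration is not public, so the shapes below
# are illustrative only.
LiveStream = dict  # e.g. {"id": "abc123", "url": "...", "title": "..."}

def sync_live_streams(
    fetch_live: Callable[[], Optional[LiveStream]],   # checks the connected Rumble channel
    post_to_feed: Callable[[LiveStream], None],       # creates a playable post on Pickax
    poll_seconds: int = 30,
    max_polls: int = 10,
) -> None:
    """Poll the connected Rumble channel and auto-post any new live stream,
    so the creator never has to copy and paste a link by hand."""
    already_posted: set[str] = set()
    for _ in range(max_polls):
        stream = fetch_live()
        if stream is not None and stream["id"] not in already_posted:
            post_to_feed(stream)                      # embedded video plays in the news feed
            already_posted.add(stream["id"])          # avoid duplicate posts for the same stream
        time.sleep(poll_seconds)

# Example wiring with stand-in callables:
if __name__ == "__main__":
    demo_stream = {"id": "abc123", "url": "https://rumble.com/v-example", "title": "Live now"}
    sync_live_streams(
        fetch_live=lambda: demo_stream,
        post_to_feed=lambda s: print(f"Posting to Pickax feed: {s['title']} -> {s['url']}"),
        poll_seconds=0,
        max_polls=3,
    )
```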
Speaker 2 (03:40):
Yeah, and it's really beneficial to smaller accounts especially, because it's just so difficult. People were asking me even this week, should I get an X account? It's like, well, if you're a creator trying to build an audience, X is not the place to be, because good luck being noticed. Even if you are paying, like you said, there's the algorithm, and every time you post something that's not linked to X,
(04:04):
it's just not going to get you anyplace, unless a larger account notices you, maybe reposts you, and carries it over. But other than that, it's not reaching anybody. And as far as I know, Pickax does not have an algorithm.
Speaker 1 (04:18):
Yeah, so our quote unquote algorithm is literally a chronological feed. In your news feed, you control your news feed; I'm not getting in the middle of it. I'm actually thinking about making shirts that say "I am the algorithm," and that's going to be our tagline, because literally you decide. So
(04:39):
for your own news feed, you control it, because anybody you follow will show up in your news feed chronologically, and then as people interact or engage with a post, it gets bumped back up to the top and then works its way back down chronologically. If somebody engages with it again, it gets bumped back up to the top, because everybody's hopping in at different times. That way, you're either going to see the most recent posts or whatever's getting engagement at that moment. So it's a way
(05:02):
that I don't have to plan out and create an algorithm that tries to figure you out, mind-read you, and decide what you want to see. No, you control it completely by yourself. And you also control what trends, because trending is just what's most popular at that time, what's getting the most engagement, whether it's a post, a video from Rumble, or an article.
(05:24):
You control it. The algorithms don't decide what we think people want to see, because I couldn't care less about that. You decide what you actually want to see by engaging and promoting and all that kind of stuff. So it's kind of taking it back to the OG MySpace days, which is kind of fun. And I'm working on throwing a few little MySpace Easter eggs in there too.
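As a rough illustration of the feed logic Jeff is describing, here is a minimal Python sketch: no recommendation model, just a followed-accounts feed ordered by the later of a post's creation time and its most recent engagement, plus a trending list that is nothing but raw engagement volume in a recent window. The data model and field names are assumptions for the example, not Pickax's actual code.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    # Timestamps of likes, comments, and reposts; empty if nobody has engaged yet.
    engagement_times: list[datetime] = field(default_factory=list)

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Newest activity first: a fresh engagement bumps a post back to the top,
    otherwise it simply sits in reverse-chronological order."""
    def last_activity(post: Post) -> datetime:
        return max([post.created_at, *post.engagement_times])
    return sorted(posts, key=last_activity, reverse=True)

def trending(posts: list[Post], now: datetime, window: timedelta = timedelta(hours=1)) -> list[Post]:
    """'Trending' is just whatever got the most engagement recently; nothing editorial."""
    def recent_engagement(post: Post) -> int:
        return sum(1 for t in post.engagement_times if now - t <= window)
    return sorted(posts, key=recent_engagement, reverse=True)
```

The design choice this illustrates is simply that the user's own follow list and other users' engagement are the only inputs; there is no learned ranking model in the loop.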
Speaker 2 (05:45):
Yeah, I never got into the social media space early on. I was well aware of MySpace; I was like, man, I've got no time for that. And Facebook, I remember when Facebook arrived at my campus, and people don't understand that. I'm like, well, watch The Social Network and you'll understand how Facebook was rolled out. It wasn't just, oh,
(06:05):
there's a website, come on in. No, it was rolled out to my campus. So then our classes started using it for group projects. I didn't create the group, someone on my team did. I'm like, what's this? That's where we're going to be tracking our project; the instructor wanted us to do it this way and report it that way. Okay, fine, and it was assigned to me. I deleted it after I was done with it. But eventually, by that time,
(06:27):
I had built up the network that made me feel I needed to recreate a Facebook account, especially with the way Facebook worked: if you have business pages, which are different from a personal page, those would be automatically deleted if there was no personal page attached to them. So I had to create one for myself under an alias in order to keep those pages alive. But then the
(06:49):
worst part was that even with an alias, Facebook would tell everybody, hey, so-and-so that you know is over here. And all anybody had to do was look at an obscure picture of me and realize, hey, look, it's so-and-so. And then I remember facial recognition being rolled out, and everything else is history. But that was then and this is now. So I've been seeing a lot more people bringing up MySpace
(07:11):
lately, like, we didn't know how good we had it, because you had more control; people said they had more control over their pages than with Facebook controlling it for you, with the ease of use. So you're throwing that out there, and I've been hearing a lot more of this MySpace talk, so I guess I missed out on that train.
Speaker 1 (07:27):
Well, when you think about it, if you go back to the OG days, whether you're talking about Twitter or Facebook or MySpace, it was actually a social network. You were legitimately able to build community. I remember it was like twenty fourteen, twenty fifteen, I hopped over onto Twitter, started a Twitter account, and started getting into theological debates and all that kind of stuff. But you
(07:48):
would see the same people all the time, because that's who you were following. And it was really interesting, because you were able to build a community around a particular belief or a particular topic or whatever it was, and you would continually see the same people. Now everybody's seeing the same handful of accounts being promoted by the algorithms, because the algorithms like those accounts, because those accounts know how to rage-post
(08:10):
and get clicks and make money on the platform and all that. And it's like, why can't we just go back to actually creating a community? This is what I've always felt as well: I would rather have five hundred diehard supporters in a community where we're actually engaging with each other and we're fully engaged, than ten thousand, fifty thousand, even a hundred thousand
(08:32):
unengaged followers. But a lot of people chase those numbers, and they don't realize they're being manipulated by the algorithms, to where those numbers, when you think about it, if you're on X or Instagram or whatever, don't actually mean anything, because the algorithms don't really look at how many followers you have. They look at the kind of content you're posting, the topics you're discussing, whether you have a
(08:53):
reputable name, all that kind of stuff, and then they decide whether people would want to engage with this type of post or not. So now it's all about making content for the algorithms, as opposed to for your community, or just sharing your opinion or anything like that. So I think that's why there are so many people now having FOMO, saying, I wish we could have MySpace
(09:16):
again, because there's really nothing out there anymore that truly allows you to build a true community, except maybe Locals and a couple of platforms where it's a paywall-type situation and everybody's tuning in and all that kind of stuff. But we need to bring back that social element of social media that I think is sorely lacking.
Speaker 2 (09:39):
Yeah, and I definitely agree with you. And one of the things that we touched on, and this is how it's all building up: the last time I had you on, you had brought up this whole thing about artificial intelligence influencers on social media, on Instagram and Facebook, and I asked, why would they even do that? Why would these companies do that? And you talked about how it would
(10:00):
keep the revenue in-house as opposed to it going to some other influencer, if you will. And since then, what we have seen on X, for example, are these weird AI accounts that are engaging with some of the main posters. You've seen this, right? I don't mean just Grok, but accounts actually engaging.
Speaker 1 (10:22):
Yeah. Well, the thing is, they know that the way the monetization system is set up on X, it's based upon replies, and the ads that are shown on replies to posts. It's not actually about whether your post goes viral or not; it's about the replies and the engagement and all that kind of stuff. So then they use bots to basically artificially
(10:42):
create the conversations. And in all reality, if you're using ChatGPT, you can easily train it to essentially sound and talk like you and all that. And then if you just connect it to something that automatically reads what's out there and then replies, it's not that complicated a thing to game the system. And that's another problem with
(11:05):
these platforms and with playing this algorithm game: there'll be a short period of time right after they change the algorithms where everybody's trying to figure out, what's the algorithm, how do I get seen, how do I do all that kind of stuff. And then once they figure it out, you have these people who game the system with bots and AI and all that kind of stuff, and then they
(11:26):
abuse the system, and then all of a sudden the platform has to change the algorithm again. And it's screwing the real humans who aren't just sitting there figuring out how to game the system. They just want to put out good content, and they're getting screwed because they're not playing the algorithm game. But these AI trolls do. And you know who wins in that scenario?
(11:47):
It's the people with the bots, because they game the system to their own benefit, and they're the ones earning the revenue, not the people who should be earning the revenue, which is the human beings who are actually creating solid, legitimate content. But they're not going to sit there and post five hundred times a day with a bot.
Speaker 2 (12:03):
Yeah, and that's what I found so concerning, because, I'll tell you who messaged me off the air, but she messaged me like, do you think there's a bot problem on X? Very much so. Everybody you've been arguing against has been a bot; don't give them the time of day. And
(12:23):
here's the worst part of it, though: there are legitimate people that are responding, and the way X works, unless they're verified, and oftentimes not even the verified ones, I don't get the notifications. The ones that are not verified, those notifications don't come through, and some of them are marked as possible spam. So there have been legit people responding to some of my posts, and I'm seeing them way later, because there are only so many notifications
(12:45):
you can track. I'm across sixteen different social media platforms, about a dozen video platforms, and, you know, I water my garden; I take care of what I take care of. But there are the ones where I have the most engagement, which would be YouTube, Rumble, Instagram, so of course that's where I spend a lot of my time. But with all that being said, it's just becoming a
(13:06):
concern. I'm now getting people DMing me or texting me, or messaging me through an instant messaging app, saying, hey, is this real or is this AI? And I'm like, oh, this is AI, that's AI, that's very much AI. And people are not noticing this kind of stuff. It's pretty scary how easily manipulated people can be, because they're just doom
(13:29):
scrolling, and in that moment, if they don't catch the artifacts, they just think it's real. And that is damaging the psyche of a lot of people. It was already difficult enough to try to separate reality from fiction when it comes to news. Now it's like, well, there's video evidence. Yeah, but the video is fake.
(13:50):
So to separate all of that, holy smokes. One thing that I can say Pickax doesn't have: I have not seen bot engagement. I haven't even seen parody accounts. I've only seen reserved handles for big influencers in case they want them, which is legit; I see
(14:10):
other platforms do it. But other than that, I haven't seen fake accounts trying to build engagement for themselves on Pickax.
Speaker 1 (14:18):
Yeah. Well, so there have been a few that have popped up, and they all try to get verified, and we're like, sorry, we're not verifying these types of accounts. So what we do is: we will verify you if your name matches your ID, or if you have, let's say, a brand or
(14:39):
a company, obviously we'll verify you; we just verify the domain and all that kind of stuff. And then the other one would be if you have a brand or a pseudonym. Let's say you're a gamer, right? Very few gamers actually stream under their own name; they all have some sort of handle. But if they come over to Pickax, you're going to want to know, am I
(15:01):
actually looking at the real gamer or not? Because if we forced them to use their name and not their gamer handle, nobody would actually know who they are. So that's why we do it that way. But we're not going to be verifying parody accounts, and we're not going to be verifying accounts where, like, I had one person that tried to say,
(15:23):
oh yeah, my pseudonym is a character from this TV show. It's like, well, sorry, unless you're the studio that controls that show, that's not your property to get verified with. So we're very particular on this, and a big reason is that we want human-to-human engagement, and we want you to know that when you see a blue check
(15:44):
mark on Pickax, it's actually legitimately that person. It's like, if Catturd came over, I can promise you Catturd is not the name on his driver's license, but you're going to want to know that you're actually talking to the real Catturd if he's on there, as opposed to a knockoff. And that's kind of why we do it the way that we do. But it also allows us
(16:06):
to weed out the bots and the anonymous trolls and all that kind of stuff, so it's a very human-to-human engagement system. You could theoretically, I guess, set up a bot account on the platform, but you wouldn't be able to get verified. And if you do it that way, if you're trying to game the system, well, guess what: your bot
(16:26):
account is never going to trend, and it's never going to be recommended. You could maybe, theoretically, be commenting on a lot of people's posts and all that kind of stuff, but one of the next things we're actually doing is weeding out unverified comments, where they'll still be there, but you'll have to click "show more" to see the unverified ones, because we have free verification on the platform. So if you're a real human being, you get verified, and you're
(16:48):
going to be showing up. It's going to be very difficult, if not impossible, with an unverified account to game the system and manipulate it. You'll never be seen, you'll never trend, unless somebody intentionally searches for you. And this is the only way that I can see where we can have a truly human-centered social media platform, without
(17:09):
intelligence agencies using bots to push propaganda, without people gaming the system and doing all that kind of stuff. It's like, let's just bring it back to: I want to talk to you because I know you're a human being.
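A tiny Python sketch of the comment-gating and trending rules Jeff just outlined: verified commenters show by default, unverified ones collapse behind a "show more" control, and unverified accounts never enter trending or recommendations. This is an assumed illustration of the policy as described, not Pickax's code.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    author_verified: bool
    text: str

def split_comments(comments: list[Comment]) -> tuple[list[Comment], list[Comment]]:
    """Verified commenters are shown by default; unverified ones are kept,
    but collapsed behind a 'show more' control."""
    shown = [c for c in comments if c.author_verified]
    collapsed = [c for c in comments if not c.author_verified]
    return shown, collapsed

def eligible_for_trending(author_verified: bool) -> bool:
    # Unverified accounts can exist, but they never trend and are never recommended,
    # which removes most of the payoff for running bot accounts.
    return author_verified
```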
Speaker 2 (17:19):
Mm, yeah, and I like the approach. Now, what about those that are looking to remain anonymous because they're afraid of doxxing and things like that, and they're handing over personal information for verification? I'm assuming you handle the data encrypted and you do
(17:40):
away with it, because you're not going to monetize based off their identity.
Speaker 1 (17:44):
Correct. Yeah, we're not going to monetize it. So literally, with our verification system, they'll upload the ID, we verify it, usually by the next business day, and then once it's verified or denied, either way, we delete the photo from our servers. We don't even hold onto it or anything along those lines. It
(18:04):
just gets completely deleted and wiped out. So when it comes to the people that want to remain anonymous, again, you can have a Pickax account and that's totally fine. But at the same time, if we're going to really promote human-to-human interaction and all that, then unless
(18:25):
you literally go set up a Rumble channel and you start doing shows and live streams, or you have a podcast or something like that and you do it under that name, that could be one way you can remain quote unquote anonymous, I guess, from that standpoint. But at the same time, it can also be very heavily abused if all of a sudden we open it up to verifying anonymous accounts, where yeah, we can
(18:49):
see you're a human, but how do we know you're not just going to try to game the system? If we want to promote human-to-human interaction, then there has to be some way for the public to know, okay, this is who they say they are, basically.
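As a simple illustration of the verification flow described above (upload an ID, review it, and delete the photo whether the account is approved or denied), here is a minimal Python sketch. The name-matching rule and file handling are stand-ins; the real process is a manual review with separate paths for brands and established creator handles, and its details are not published.

```python
import os
from enum import Enum

class Decision(Enum):
    APPROVED = "approved"
    DENIED = "denied"

def review_verification(id_photo_path: str, display_name: str, name_on_id: str) -> Decision:
    """Grant the blue check only when the display name matches the government ID
    (hypothetical rule standing in for the manual review described above)."""
    if display_name.strip().lower() == name_on_id.strip().lower():
        decision = Decision.APPROVED
    else:
        decision = Decision.DENIED
    # Either way, the uploaded ID photo is deleted once the review is done,
    # so no identity document stays on the server.
    os.remove(id_photo_path)
    return decision
```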
Speaker 2 (19:06):
Yeah, and I agree with you, especially because it's your platform; those are the guidelines you're setting. And from everything that I've seen, you've been guiding yourself by the Constitution. I don't think I've seen many individuals like you leading a platform while being so strongly committed to the Constitution as your guideline. I
(19:28):
think maybe one other one. But it's one of the things that has come up a lot recently, even from the top, about those on the right, the ones who are supposed to be the champions of free speech: supposedly they want anonymous accounts to be gone, they want people's actual names on them. I don't remember who said that recently. But then, in combination with that, you messaged out an
(19:52):
article you wrote about Kash Patel wanting censorship of things like conspiracy theories or clickbait and all that. It almost goes in tandem, because they're calling for censorship in that respect, and they're basically calling for the doxxing of such accounts. We all remember what happened with Libs of TikTok: no one knew it was Chaya behind it, but
(20:13):
Taylor Lorenz wanted to make sure that she could be doxxed. And there are a lot of safety and security implications that come with that.
Speaker 1 (20:21):
Yeah, well, and this is the thing: it's very concerning when you're dealing with conservatives and people that claim to champion free speech all of a sudden talking the way that Kash Patel was the other day, literally as the FBI director. I've had Kash on my show before, and I've had a chance to interview him and talk to him, back when he was
(20:41):
promoting Government Gangsters and all that kind of stuff. He's always been against censorship. He was huge in pushing Truth Social because of all the censorship that was happening on all these other social media platforms. But now he's the FBI director, and I can see the frustration. I can hear it in his voice when he's testifying before Congress and the Senate. He's trying to
(21:02):
get information out there. I think a weakness of Kash, to be honest, is that he's trusting a lot of the people in the FBI who are the government gangsters that he wrote about. So repeatedly he's being asked, did you actually look at the files? No, I looked at the report that was given to me by my team. It's like, okay, but realistically, have
(21:24):
they done their due diligence? How are you checking their work? How do you know they're following up and investigating more, or are they just relying on the documentation the previous administration left you? There are lots of questions. And so then people feel like, okay, you're saying that Epstein wasn't trafficking people. Okay, sure, but you're the one that said
(21:47):
the FBI director has the black book with all the names in it, and then all of a sudden that's nonexistent in this administration. We feel like we're being gaslit. There are legitimate questions that people have, and I know it can be frustrating for somebody like Kash Patel, whether he's telling the truth, or whether he's gaslighting, or whether he's a useful idiot being used by other people. He's trying
(22:08):
to get information out there and push a particular narrative about all these different things. And then you have all the quote unquote conspiracy theorists, and some of them are crazy, I can attest to that, and I know how frustrating that can be. But also realize and remember this: the reason why the conspiracy theorists have taken off over
(22:29):
the last several years is because you, as in the government and the mainstream media, have lied to us over and over and over and over again about virtually everything. Whether you're talking 9/11, the wars in the Middle East, COVID-19, Epstein, literally everything, we've been lied to over and over and over again. So
(22:51):
don't complain when we don't just take your word at face value, and we don't trust you when you come to us as the government and tell us something. For a lot of people, they're like, okay, you lied to us about this, this, this, and this. If I'm going to make an educated guess, it's probably going to be the exact opposite of what you said, because you've lied to us so many times. It's up to Kash
(23:13):
Patel and Pam Bondi and Tulsi Gabbard to earn back our trust, not to try to shut down people that question their narratives.
Speaker 2 (23:21):
Yeah, totally. It's like, oh, they're sowing mistrust in our institutions. I'm sorry, you guys did that to yourselves. And it's not just going to happen overnight that we say, okay, now I trust you. That's why there's been all this talk about Charlie Kirk and all that stuff. And I even said on my show, within twenty-four hours, I said, I know what's going to happen. There's going to be a lot of stuff coming out, because no one's going to like the answer.
(23:45):
And even when all the information comes out, it still won't be enough for some individuals, especially because of what we saw. We saw an assassination, and Charlie Kirk became a martyr, and to us, especially those of the faith, we want answers, like, Lord, why? So that kind of stuff will always feed into people's desire to
(24:07):
learn more, and in some cases push them to extremes, as we're seeing happen right now. So I totally get that, and there's this weird balancing act that needs to be done. But we're talking about playing around a very slippery slope in terms of taking on the First Amendment. And that's what the Patriot Act was trying to do: for our safety, our security, we give up some more freedoms. Look how quickly it was turned around.
(24:28):
What we're seeing literally play out right now, with the help of AI, is the movie Captain America: Civil War. I cannot think of a better movie in pop culture that describes it than that one, because most people have seen it too. It's like, oh, it's meant to go after the bad guys. Yeah, but Hydra infected it, so now it's the other way around: it's targeting all
(24:49):
the good people, and it's judging you based on your DNA, your genetics, your social credit. Like, wait, that sounds familiar. That's all happening right now. We have AI being injected into our government in ways that we've never seen, from the DoD to health, and I'm like, dude, no, that's not what we're supposed to be doing.
(25:10):
And you've been very, very vocal on that, Palantir being one of those I was very concerned about right around the election. One of the concerns was all those big tech hawks circling Trump, like, hey, we want in. Mark Zuckerberg infamously trying to get in on the crowd. Like, dude, all of you guys tried to erase his existence, and here you guys are, like, hey, take my money,
(25:31):
because there was no way they could steal this election anymore. Okay, well, how much of Intel does the United States government now own in shares? Ten percent?
Speaker 1 (25:43):
That's massive. When you think about it, ten percent of Intel? That's insane.
Speaker 2 (25:48):
I'm not even... that's what leaves me speechless. I'm like, no. I was against the government owning part of General Motors during the bailout. And what happened when General Motors wanted to pay back the shares so they could buy the government out? It was like, nah, we'll stay here. And what happened? They were dictating what kinds of cars should be made, cars that nobody bought, that would break down, and things like that. But now we're seeing it at
(26:10):
levels like infecting the technology that we have. So there's this whole concern about privacy and security, and we have Palantir and OpenAI in contracts with the Pentagon and the DoD and all that. How much of a slippery slope are we looking at? What do people not understand about AI? I guess let's start with that, because you're more in the tech space than I am.
(26:30):
So what is AI? Because people might think of it as a Google assistant, maybe, like Siri or Google Assistant or Alexa. But what exactly is AI? Can you speak to that?
Speaker 1 (26:42):
Yeah. So basically you have a large language model, which is essentially a massive database; it's all the information and data that the AI will pull from. And the bigger the database, the bigger the large language model, and you could argue the better the AI will be. A lot of it also
(27:02):
has to do with the kind of programming and the bias that you write into it, and anybody who says that AI is unbiased is insane, because it's only as good as the programmer and only as good as the database. And the majority of these AI platforms are pulling either from something like X, from social media, which is a diverse set of beliefs, or they're pulling from the mainstream
(27:24):
media for reports. And this is the problem that we face: AI can be easily manipulated. If you remember, a couple of months ago Grok went crazy and started posting all this antisemitic, pro-Nazi, pro-Hitler stuff. That just took a couple of keystrokes. It was not a complicated thing
(27:46):
to make that change and have it go crazy. And this is what I keep cautioning conservatives about, because this is what's strange about what's happening right now. Bernie Sanders just came out with his report on the future of artificial intelligence. Literally, Bernie Sanders is saying that in the next decade it's going to take a hundred million jobs away from Americans.
(28:07):
So there are going to be a hundred million jobs lost because of artificial intelligence over the next decade. I would say it's probably more, if you listen to Elon and Bill Gates and all of that, but let's just go with a hundred million. And then you have Donald Trump over here fast-tracking it. They tried to get it into the Big Beautiful Bill: let's completely deregulate the entire AI industry, which is bonkers. And then, when there was an outrage, they got that pulled from the bill.
(28:29):
And what does Trump do? He signs an executive order basically doing the exact same thing. And you're like, okay, so why am I, a constitutional conservative, finding myself more in agreement with Bernie Sanders on AI than with Donald Trump? Which is absolutely insane. But, to kind of bring it back to what you were talking about with Intel and the Trump administration getting ten percent
(28:51):
of Intel for the federal government: now we've set a precedent with Intel. This is a tech company, and they're building the infrastructure for the implementation of artificial intelligence within our own government. So if they can get ten percent of Intel, what's to stop them from getting ten percent of Grok, or ten
(29:13):
percent of Oracle, or ten percent of any other platform? And these platforms are collecting data on the American people. Now you're getting into privacy issues, where data is king right now. We need a separation of private entities and the federal government, especially if they're going to be regulating, especially if they want to be neutral, especially if they want
(29:34):
to pass legislation that applies equally to everybody, which they rarely actually do, but there should be some semblance of neutrality in our government. Now, if you have a vested interest in seeing the success of Intel, then you're going to pass legislation that benefits them, and probably give them contracts over their competition, because you don't have an equity stake in those other companies. And it's
(29:55):
interesting, because I went on Matt Gaetz's show a few weeks ago and we debated this very issue. Matt Gaetz was actually taking the approach of supporting the government taking the ten percent of Intel, and he was calling for the nationalization of Boeing, where the government would take one hundred percent of Boeing. And I took it back to our
(30:17):
founding fathers. If you go and look at the history of our government owning parts of companies, this has happened basically since the early nineteen hundreds. Usually it's a temporary thing, where the government will come in, they'll give the money, they'll get a stake until they either get the money back out or sell it, kind of like what they did with the bailouts in the car industry. But that didn't happen until the early nineteen hundreds,
(30:39):
which is when we saw this massive explosion of the federal government. Before the nineteen hundreds, there was only one instance where the government had a stake in a private entity, and it was actually at the founding of our country: the First and Second National Banks. It was very divisive among our founding fathers, because they wanted to establish some kind
(31:00):
of bank that could provide loans to companies and corporations. But Alexander Hamilton convinced George Washington to sign legislation whereby the government would get twenty percent of the bank, in order to give credibility and reliability to this basically centralized bank
(31:24):
at the foundation of this country. Guys like James Madison, guys like Thomas Jefferson, a lot of the founding fathers that we all respect, staunchly opposed this, and there was a big fight happening. And what was interesting is that if you track this, and again, this is just a bank where twenty percent was owned by the federal government, by the time Andrew Jackson
(31:45):
became president, he saw so much corruption happening, in the sense of loans being given to certain people, and certain people not having to pay back their loans. It was basically almost like an embezzling scheme being used and abused by the politicians and the elites and all that kind of stuff. So Andrew Jackson demolished it. When he ran for president, one of his big issues was getting the government out
(32:07):
of this private bank, because of how corrupt the system became when the government had twenty percent ownership of the bank. And after they dismantled the Second National Bank, we never had another federal ownership stake in a private entity until the early nineteen hundreds. So you're talking about a hundred years where that did not happen,
(32:28):
and now we're seeing it happen more and more and more. And now you have supposedly constitutional conservatives defending the federal government owning private companies. And you're just sitting here like, if you can own ten percent, what's to stop you from owning twenty percent, or twenty-five, thirty, fifty, one hundred percent? Now
(32:50):
we're getting into full-on socialism, fascism type stuff, and we're supposed to be the anti-socialist, anti-fascist side. It's bonkers when you actually think about it.
Speaker 2 (32:58):
Yeah, and this kind of ties in a little bit too, because you have a social media platform, so I'm sure you were following very closely what the fate of TikTok was going to be. Because I had my perspective, but from your vantage point, what did you see play out? And ultimately TikTok gets to remain. Who
(33:18):
are the owners now? I think it was Oracle, right?
Speaker 1 (33:22):
Oracle is going to do all of the infrastructure, which means they get to collect all the data. And that's the thing about Oracle: Oracle is one of the biggest data collectors out there.
Speaker 2 (33:29):
And this all benefits them. But we blamed it on being a Chinese CCP thing, that it was spying over here. It's like, well, just don't use it; that's all you've got to do. But now you have the federal government saying, you're out unless you do what we tell you to do. And now it just seems very beneficial that it's Oracle
(33:51):
that ended up getting this access. Look, there was a big change with Elon buying Twitter, but let's not kid ourselves: he bought it for the data. I said that very early on, for his Neuralink, for his Starlink, for Grok, and now the Tesla Optimus robots,
(34:12):
and I think it was one of its designers who said that the goal is to have two to three Optimus robots for every human. I'm like, what? Why? It's like, dude, I thought that was someone else, not you guys. But it's scarily real. So how did you see this whole TikTok thing go down? I'm sure you were following it very closely.
Speaker 1 (34:34):
Oh yeah, for sure. Well, I mean, Trump was even proposing that he wanted the federal government to own fifty-one percent of TikTok; that was his original proposal. And I'm just sitting here like, so now you want a government-owned social media platform? Talk about propaganda. And this is the thing that I keep coming back to, and it's falling on deaf ears with conservatives: okay, let's just hypothetically say that everything Trump
(34:56):
wants to do is great, right? He's not going to abuse the system, he's not going to do all that kind of stuff. So let's just say, okay, we get fifty-one percent of TikTok, and he's not going to censor people. Let's hypothetically say they're implementing AI everywhere and they're not going to use and abuse it, and RFK Jr. is not going to use and abuse AI in HHS for the FDA approval process and all that kind of stuff.
(35:18):
What happens when Gavin Newsom runs for president, and let's say he beats JD Vance in the next election? Do you think they're going to weaponize it against you then? Are you then going to regret implementing artificial intelligence? Because let me ask you this: they're not just using AI to analyze data, they're actually going to be using AI to make decisions and deal with the FDA approval process and
(35:41):
all that. What if AI makes a mistake, or, again, quote unquote makes a mistake? Who's held responsible? It's not going to be the coders; they'll just say, oh, it was a glitch, sorry. There's going to be zero responsibility. But if there's a human making the decision, you can hold them responsible. You can sue them, you can get them fired, you can arrest them, you can do all different kinds of things to somebody
(36:01):
who violates your constitutional rights. So when we're dealing with the implementation of AI everywhere, this is a major problem that we're dealing with. But specifically with TikTok, it's one of those things where I don't like the idea of specifically targeting one company that the CCP basically had a stake in,
(36:22):
targeting that one company and ignoring the rest. To me, it almost felt like they were throwing TikTok under the bus, probably because there were people like Larry Ellison and Oracle and all these different people that would love to swoop in and buy a social media platform for all the data and all that kind of stuff. But they threw this one company under the bus.
(36:44):
Do you realize how many Hollywood studios are owned by China, how many publishers, how many Silicon Valley companies are partially owned by China? All the importing that we do. And then the claim is that the Chinese are spying on us. Okay, really, what are they spying on? When you think about it, on TikTok they're spying on kids dancing and doing conspiracy videos and pop culture kind of stuff.
(37:09):
When you actually think about a national security threat, TikTok is very low on my totem pole of national security threats. But they did this, and they did it under the guise of national security, while they ignore the real national security threats of the influence China has over us, and all the data collection that they're doing; and we're making deals, and
(37:31):
we're bringing hundreds if not thousands of Chinese students over to learn in our schools so that they can go back over there. When you literally think about it, this was a really, really bad thing that we did in forcing them out. If you're going to do it, do it on general principle. I've long called for banning China from doing business in the United
(37:52):
States of America. They are our enemies. We should not be allowing that, we should not be allowing their immigrants, we should not be allowing their students to come over here to our colleges and learn. We need to treat them as the enemy that they are. So cut them off altogether. But don't just pick and choose one company and act like, oh yeah, this is a national security threat, we need to take it over,
(38:13):
and then give it to the biggest data collector in the country. And this is the thing that Kennedy said when he was running for president: everybody's complaining about China and data collection and all these foreign threats. I'm more concerned about our own government collecting our data. I'm more concerned about the CIA doing it. I'm more concerned about these big tech oligarchs doing it than China
(38:35):
doing it. And when you think about it, if we would just cut China off, they don't really gain anything by collecting our data. They wouldn't really gain anything with TikTok if we would just cut off their ties to the United States of America. The whole thing felt like we were being gaslit, in my opinion.
Speaker 2 (38:52):
Absolutely, because if our government actually cared about that CCP threat, Eric Swalwell would not be anywhere near Congress; if anything, he'd probably be behind bars, but at the very least nowhere near Congress. The guy slept with the CCP spy Fang Fang, and he was sitting on
(39:12):
intelligence committees and all this stuff. You can't make this stuff up. But yeah, I'm in total agreement with you there, because I have a couple of very respectable friends, and I told them I am not in favor of this. It's a very slippery slope, it sets a bad precedent, and it's picking and choosing something when all we've got to do is just not use it. I'm sorry,
(39:33):
I don't see the gain that TikTok has, except for the influence it has on our youth, and that calls for better parenting. That's all that I saw. So now that's all transpired, and it's transpired just in the past few months.
Speaker 1 (39:45):
Man.
Speaker 2 (39:46):
That's why it's crazy. Since the last time you and I talked, it's like we were speaking about the future, and now we're living in it already. And this whole AI thing plays into everything, because it's basically allowing some kind of machine learning system to make decisions. And Palantir is being used by DHS, I believe, in these ICE
(40:08):
raids to identify individuals. So that means that AI has access; they've given it access to databases, and the government has whatever information on us, like TSA, Clear, all that stuff is all there, plus whatever the big tech companies are giving them. And now Palantir
(40:30):
and OpenAI are most likely getting access to all that. Yeah, that's no bueno, man. That's all I can say, it's no bueno. And it's under our guy. I voted for him; I didn't vote for this. But it's wild to see all that.
Speaker 1 (40:46):
Man. Well, it's really interesting, especially dealing with Trump, because, again, I endorsed RFK Jr. during this last election cycle, and then when Kennedy came on board, I backed Trump and I voted for him, and I don't regret voting for him. But I try to be very intellectually honest about things, and
(41:07):
I would never in a bajillion years vote for Kamala Harris, so I'm just prefacing what I'm going to say with that. But Trump has a lot of good gut reactions, and he's not a constitutional conservative. I don't think anybody can legitimately make the claim that he's a constitutional conservative. He's kind of a big-government Republican, and he makes gut reactions; some
(41:29):
of them are good, some of them are not. But the things that they're doing with AI specifically could be way more catastrophic to the future of this country than anything Kamala Harris ever would have done. We're looking at this: they are setting up for the complete overhaul of this country. They are setting it up
(41:52):
for all of our constitutional rights to go out the window. I mean, Sam Altman, the CEO of OpenAI, has already said that the AI world does not mesh with the constitutional world of the United States of America, so we're going to have to have a new social contract. That is what he is arguing for, because we can't live with constitutional rights in this new
(42:13):
era of AI, where you don't have any privacy, you really don't have free speech rights, you don't have all this kind of stuff. And when you think about it, sure, under the Trump administration, and maybe theoretically as long as there's a conservative or Republican in office, they're not going to take away our rights, although we see this happening all the time. But you
(42:33):
could make the argument, okay, they're not going to abuse the system and all that. But like I said, as soon as Gavin Newsom gets in there, and he's going to do everything in his power to get into the White House, the Trump administration has set him up perfectly to take away all of our rights with the implementation of this technology. So at what point are we going to sit back and really think, okay, maybe I need to think beyond just this presidential term,
(42:57):
and I need to think five years down the road, ten years down the road, twenty years down the road. What precedent are we setting? Are we setting ourselves up for success or failure? And the one thing that I keep getting told all the time is, Jeff, you really need to knock it off, because we have to beat China in this AI race. And I'm like, no, we don't. That's mutual destruction. That's like saying, well, we
(43:18):
need to drop a nuclear bomb before they drop a nuclear bomb, and then we're all going to die. It makes no logical sense. The way that we beat China is not with more tyranny and more centralized control. The way that we beat China is decentralization, and not relying on this kind of centralized technology that they can use and abuse and beat us with through artificial intelligence. If we were decentralized, they couldn't do that.
(43:41):
For example, there was this big story: the FBI did a big raid in New York because they found hundreds of thousands of SIM cards that could be used to automatically send out tens of thousands of text messages in a matter of seconds and completely shut down the entire cell tower infrastructure in the entire city. Because we have this
(44:04):
centralized infrastructure set up in this country, and it's becoming more centralized as time goes on, it only takes one thing like that to shut down the whole thing. Whereas if we actually had competition, and we were decentralized, and we were more focused on local infrastructure instead of trying to nationalize everything, we would actually be much
(44:24):
more naturally secure, and it would be much more difficult for AI to destroy our country or take over and all that kind of stuff if we weren't so reliant on technology. What difference would it make if AI went crazy, if we weren't so reliant upon technology, and going to digital currency for everything, and a digital ID and all this? We are setting ourselves up for failure. And it's
(44:47):
like everybody just has blinders on: oh no, we'll be safe. It's this sense of invincibility that I'm very, very concerned about, because it's not going to be that much further down the road when things just go haywire.
Speaker 2 (45:00):
Totally. And that's what I saw being sped up as soon as COVID hit, which China let loose. I always found it curious that I went to my bank and there was a coin shortage. Like, how? Who did the bank run? We didn't even get any information that there was
(45:21):
a concern about it, and then all of a sudden all these places are going cashless, like theme parks: oh, we're a cashless venue, and if you have cash, put it into this machine and convert it into a card that you can use in the place. We're a cashless venue. I heard Erik Prince say in one interview he did, I think with Shawn Ryan or someone, that it is constitutionally illegal for a
(45:43):
company to turn down the US dollar, because it's legal tender. So all these companies that are saying no cash here are technically in violation of constitutional law; it's just that no one's actually enforced it. And I'm like, good point. I know it probably won't be the person behind the counter, they're not the ones calling the shots, if I say, you've got to take my dollar, and they say, oh, management says that I can't.
(46:03):
But I know that as a collective whole we can do a bit of a pushback on it. But yeah, that helped speed us into this more digital space, and now we're five years down the road. Look where we are, with AI coming in now, as scary as everything we've been talking about sounds. We've been talking about it being implemented in
(46:26):
a way where it's a double-edged sword. Like, yes, okay, if it's done for good, then sure, fantastic. But if that sword were in the hands of somebody else that we don't like, it could be used against us. But AI itself, I'm seeing it, I use AI. I don't lie to people; I say, dude, I use different AI services to help with my show production and content creation. It helps a lot, especially when
(46:49):
you're working on a budget, so that helps. How then do you see, because I'm sure you saw the announcement that Perplexity is partnering with Rumble and Rumble Cloud, and Rumble Cloud is their infrastructure and all that, and they're actually the backbone of Truth Social and even more than that. So what do you see AI aiding in,
(47:10):
with the implementation of this partnership with Rumble? Do you have any insight into that?
Speaker 1 (47:14):
Yeah, I don't have specific insight into it. It'll be interesting to see how they roll it all out. It's a little concerning, but again, I think I have a pretty extreme worldview when it comes to AI, and I don't fault anybody for using it necessarily, because, like you said, if we want more people to be
(47:35):
entrepreneurs and individuals and all that kind of stuff, you only have so many hours in the day. For me, I always say I only have so many hours in the day. So a lot of times we use AI within Pickax, and I think the only implementation we have using it right now is for rooting out pornography and nudity on the platform. That way, if somebody's posting a picture,
(47:56):
it'll immediately recognize whether it's pornographic, and then it'll shut it down. That's the only implementation of AI that we use within the platform, just that instant recognition. And then I will use ChatGPT for editing my articles, so that
(48:17):
I don't have to try to find somebody to help me edit and all that kind of stuff, and I don't have time to read through a piece five times to make sure I did the punctuation properly. So I will use it, and I always tell everybody: on a micro level, AI is valuable, in these specific instances, in the application of it, in data analysis and research and all that. AI is valuable, and I
(48:38):
don't discount that at all. But where these guys are trying to take it on the macro level, that's where all of this is dangerous. So when I'm looking at what Rumble's doing, and again, I'm saying this as somebody who is fully integrating with Rumble, we enjoy Rumble, all that kind of stuff, I'm mostly keeping an eye on it and seeing, okay, how are they going to use the AI, because I don't want the AI to impact what we're doing
(49:01):
on Pickax. So it's going to be interesting to see how they roll it out and integrate it and all that. But I just want conservatives and people that love freedom to be aware that when it comes to technology and AI, everything comes at a price. I used to use this illustration all the time, and I don't as
(49:22):
much anymore, but there was an old ABC show called Once Upon a Time that was out for a while, and it was all the Disney characters in a live-action kind of a thing. And there was this one character in that show called Rumplestiltskin, and he was kind of the central figure in the show. And when people
(49:43):
would come to him and say, should we use a magic spell for this, or should we do something like that, he would always say, and he was kind of the villain, but he had this wise line: magic always comes at a price. The question is, what is that price? And so you have to weigh whether it's worth the price. And that's the thing
(50:05):
with AI. There are a lot of amazing things that artificial intelligence can do, but there are also a lot of really horrific things that artificial intelligence can do. And my question is this, and I kind of lean towards it's not justified and it's not worth it: the catastrophe that AI can bring to humanity will literally
(50:27):
end humanity as we know it. If we just trust Sam Altman and Larry Ellison and Elon Musk and all these guys when they're telling us the direction that AI is going to take us, we're not going to have humans in the workforce anymore. You're no longer going to need to learn from your mistakes, because you're never going to make a mistake, because AI is going to do everything for you. You just instantly
(50:48):
think about something and you can 3D print dinner, just by thinking it, because your brain's connected to the internet, so you no longer have to critically think. They're taking away our humanity. And I'm like, is that worth the cost? Is that cost worth being able to do all these amazing things that artificial intelligence can do? They're literally sucking the humanity out of us. When
(51:11):
you actually sit there and think about it, is that really worth the trade-off? I would argue no, but that's just me.
Speaker 2 (51:16):
No, I'm with you there, man, I'm with you, because it's not like, oh, it's going to be this massive destruction of humanity. No, we lose our humanity, quite literally. We all become what the movie WALL-E shows: we're just riding around, not doing anything, and we're just unhealthy slobs. So yeah, it's prophetic, it's prophetic in a way,
(51:38):
and I think that's a good way to close this out. So I'm going to let you have the last word, since, like I just said, it's a good place to close out. What can people expect more from Pickax, and where can they follow you and learn more?
Speaker 1 (51:50):
Yeah, honestly, the best place to follow me is on Pickax. You guys can go to pickax dot com. We're actually testing out a new feature right now, and I'm kind of in the final phase of my initial beta testing: it's an email newsletter feature. I've completely switched from Substack over to Pickax, so all of my emails and articles and newsletters and everything come directly from Pickax. We're going to
(52:11):
be rolling this out more extensively later this year, but you guys can actually subscribe to my newsletter on my Pickax account. Just look up Jeff Dornik on Pickax; you can subscribe and get all of my articles emailed straight to your inbox. But I think, for me, I'm encouraging everything not just to be self-serving as the CEO of Pickax, but if you actually
(52:34):
want a voice, and you don't want these big tech companies to just collect your data and see you as a small little cog in this big machine or whatever it is, and you actually want individual rights and individual freedom, Pickax is the platform to go to, because I'm not going to get in the middle between you and your audience. I'm not going to play that idiotic game that these big tech companies are playing.
(52:56):
So we've got a lot of plans, a lot of things that are going to be rolling out over the next couple of years, and we're going to systematically take the control and the power that big tech has over us, take it away, and give it back to you, the people. We are going to use Pickax as a centralized hub, and then all these different things that we're going to develop
(53:17):
are going to flow through that, and it's going to streamline things, make things easier for you, and be very human-centered. That's the direction that we're going. And then one other thing that I just want to throw in there as well: for content creators, like Rumble streamers and bloggers and all that, we actually just hired a really awesome guy, Trevor Shipman. He's now our director of Creator Success, and his job is to help
(53:40):
you be good at your job of being a creator. So it's like: perfect your craft, get better, how to monetize, how to write articles more effectively, how to live stream, what are the tools that you need, what things can you learn, all that kind of stuff. So he and I are sitting down now, we're brainstorming, we're putting together a program where we're going to do an annual creator event or something, get everybody in the same room, and mentor people.
(54:01):
We'll do regular trainings and teachings, and our job is to help you be better at your craft. That's something nobody else was doing, and we're like, let's actually do something like this. So that's kind of what we're doing. I highly encourage people to check it out, go to pickax dot com, and then we're launching the app here in the next couple of weeks.
Speaker 2 (54:18):
I love it, it's beautiful, I love the philosophy. Jeff, always a pleasure, and I'm sure we might do this again sometime, and hopefully we still have some humanity left when we connect again.
Speaker 1 (54:28):
Oh yeah, we've got to set up our own separate ecosystem that's human first. That's what we're doing with Pickax.
Speaker 2 (54:34):
Absolutely. Jeff, well, don't hang up; we're ending the recording here. Thanks very much, and I'll see you on the socials.
Speaker 1 (54:40):
Thank you.