Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Fire up those VPNs, put on your tinfoil hats, and
find yourself a friendly neighborhood hacker, because they might be
all we've got. Now it's time for suggested articles a podcast. Well,
(00:32):
hello everybody, it's me, Two-F Jeff, coming to you today,
almost but not quite all by myself, because One-F
Jeff had some stuff come up, Aaron had some stuff
come up, So I got onto the dark web and
hit up some of my frequent shady haunts. And who
(00:54):
was there just hanging out on a corner of the
dark streets was Neon Chaos. Neon Chaos, thank you for
agreeing to come keep me company today.
Speaker 2 (01:05):
Of course, although I should point out you're never technically
alone in today's panopticon of surveillance.
Speaker 1 (01:13):
I absolutely agree with that. But there's alone, and there's
being watched, and you can have a conversation
with whoever's watching me. Yeah.
Speaker 2 (01:24):
The NSA personnel generally don't talk back.
Speaker 1 (01:27):
No, no, not typically, And I know that they're there,
we've talked to them before. I can't remember your name
right now, I'm sorry, but your van is still across
the street. So I hope you enjoyed the show. Yeah,
so interesting times. We recorded our last episode and posted
our last episode, right, I believe it was February nineteenth, sorry,
(01:50):
January nineteenth, so it was the day before the Trump inauguration,
and that was approximately eight months ago.
Speaker 2 (02:00):
Feels more like a couple of millennia.
Speaker 1 (02:02):
Yeah, I don't think I've ever felt so exhausted in
my life. How are you neon chaos?
Speaker 2 (02:11):
I'm also a little overwhelmed doing my best to make
sense of the chaos, No pun intended. I will say
that I share your feeling. It's a lot like drinking
from the garden hose. But what I will say, and
I've seen this posted at a couple of different places, the
(02:35):
deluge of things coming out of the White House, I
feel like is by design. The idea is to overwhelm
everyone with what do you focus on and essentially exploit
the low attention span that we've all kind of cultivated
over the past decade and change to basically make it
(03:00):
impossible for any of us to react or you know,
mount any kind of meaningful resistance or defense against the
I'm just going to say draconian, malicious, and nineteen-twenties-esque
return to form. I will follow up on that,
(03:28):
and I specifically highlight the nineteen twenties. There was a
post that I saw on Mastodon, which I think
we'll probably get into a little bit later in the
episode from someone and I checked it out, and this
actually is true, at least as far as I'm able
to confirm. So there's this story basically that when Hitler
(03:51):
was rising to power during Nazi you know, the burgeoning
days of Nazi Germany, he actually had sent government lawyers
to the US, specifically with an eye toward looking at
how the Jim Crow era
laws were structured and enforced in the South. And the Nazis,
(04:13):
I want to be clear, these were Nazi attorneys. These
were guys who were true believers in the fold, genuinely,
you know, wanting to be standard bearers for this movement,
for this party, for everything that it stood for, for
the Führer. They returned to Adolf Hitler and told him
that what they had seen was too extreme for them.
Speaker 1 (04:37):
Well, and if I'm recalling correctly, and I think I
learned this somewhere along the way, that's not the only
way we inspired Hitler and his rise to power in
his programs, Because America was doing eugenics.
Speaker 2 (04:51):
Before. John Harvey Kellogg, the guy who created Kellogg's corn flakes,
so think about that the next time you dig into
a box, he was a major proponent of the
eugenics movement. And it's also important to remember the US
actually was not unanimous in its condemnation of what was
coming out of nineteen-thirties Third Reich Germany. Charles Lindbergh
(05:17):
very famously was a staunch Nazi supporter, as was Henry Ford. Yeah,
both of them were very ardent supporters of the Führer
in his programs and what he stood for and basically
felt that the US should ally with them and you know,
throw them a few bones.
Speaker 1 (05:38):
Yes to all of that. Our history is very muddled
that way. Look, I mean, now, okay, there's another
side to this. For example, I was catching up on
The Daily Show because I do think it's interesting seeing
some of these comedians have their take on what's
going on. John Stewart made some good points, but it's
(06:06):
sort of telling us not to panic, which is not easy.
But his point was like, yes, Trump said he was
firing all these inspectors general, but he can't actually do that.
They have to get at least thirty days notice, and
so on and so on. Yes, he can try to
block funding for certain things. Yes, he can declare that
there's no such thing as birthright citizenship. But courts are
(06:29):
already stepping in to block some of what he's doing
because it's not part of the powers of the president,
and he can't just throw the constitution out the window.
And that is all true, but it's that chaos thing
you're talking about, right Like, some of these things are
going to slip through the cracks and some of them
are just going to be underreported on.
Speaker 2 (06:50):
Well, and let's be clear, I mean in terms of
what's happening right now. I mean, so you have this
memo or this executive order, a memo basically, that came out
rescinding all federal funding for aid programs, foreign and domestic,
you know. So we're just, again, to be clear for
the listeners, let's enumerate what some of those are. SNAP EBT.
(07:14):
This is the program traditionally known as food stamps. WIC,
which is a program specifically targeting low income mothers and
young children. And when when I say young, I mean
zero to three years of age. This is not like
you know, mother in her thirties with a kid who's
in his teens. It is infant to toddler. And that's
(07:37):
you know, the extent of it. Veterans Affairs is technically
included in some of that, specifically around healthcare and that
kind of stuff. You know, these are pretty big
programs, to say nothing of, and I actually saw this
this morning, the entire USAID website has been taken down.
(07:59):
And for those that don't know, USAID is a
program that was started in nineteen sixty one. It was
signed into law by John F. Kennedy. Congress passed it,
JFK signed it. And this is a program that puts
billions of dollars out into the world for development. This
was a program that, as mentioned, was stood up in
(08:20):
the sixties and it was supposed to be effectively what
the Belt and Road Initiative is now for the US to
build goodwill, to help increase infrastructure, build out emerging economies,
build allyship with other countries in this kind of East
versus West ideology, communism versus capitalism, the stalemate that we
(08:45):
had during the Cold War. That website has been completely
taken down and there are questions now about whether or
not those funds will continue to go out. It is
worth pointing out that as of recording, Elon Musk now
has complete control of the government's payment systems, So things
like Social Security, things like USAID, things like veterans benefits,
(09:08):
things like Medicare, Medicaid, defense spending, all of these payment
systems are now under his purview to basically do with
as he pleases.
Speaker 1 (09:21):
So well, I believe he's going to be integrating all
the government payments and stuff into X, the everything app.
He wants that to be, you know, where we go
for all of our banking. So it's probably just you know.
Speaker 2 (09:36):
Well not just that. And I'm not sure if you
saw this. I saw this actually yesterday. The FAA, how
topical. Well, because of the plane crash that happened in
the DC area, the FAA announced yesterday that all of
their official updates going forward for everything, not just the
immediate recent plane crash, but everything going forward, will exclusively
(10:00):
be posted to Shitter.
Speaker 1 (10:02):
I see, I didn't know that one.
Speaker 2 (10:04):
Yeah, So I mean, let's call it what it is.
The cult of personality that put Trump in the Oval
office allowed him to hand dictatorial power over to what
I am calling the first foreign born President in complete
violation of the US Constitution. Elon Musk and to your point,
(10:25):
specifically John Stewart's point about yes, we have the safeguards
and these guardrails in place that are supposed to act
as bulwarks to prevent this kind of dictatorial and authoritarian
subsuming of power. That only works if the other branches
of government assert themselves as being coequal. And so far,
(10:47):
what I have seen since January twentieth is we effectively
have an autocrat who is amassing and centralizing power in
himself and his staff, and his loyalists and his sycophants
and his cronies, and a Congress that is aligned in
terms of political party, but all too eager to give
(11:09):
up the reins of power to this individual and whoever
he then delegates his will to be carried out by.
So saying yes, we have three co equal branches of
government is a little bit like saying Heinz has fifty
seven different flavors in one bottle. Technically, that's true in
(11:30):
that there's fifty seven ingredients, probably, but you're only getting
one product. And in this case, the product that we
are getting is the great American experiment. In my estimation
being retired.
Speaker 1 (11:48):
Funny enough, in a book published, probably, if I had
to guess, between fifteen and twenty years ago, America (The Book),
which was put out by The Daily Show,
they made a joke about how we couldn't possibly have
a dictator unless, you know, because we have all these
checks and balances, and unless, of course, if power got
consolidated in all three branches, and then you know.
Speaker 2 (12:12):
I believe America (The Book) was released, and I'm confirming
it right now, but I believe it was released in
two thousand.
Speaker 1 (12:18):
And four. Twenty years ago, there you go. Yeah, well,
twenty one. But okay, so this is sort of a
tech podcast, and I did have Elon on my agenda.
So since we're already talking about our possible new emperor, boy,
I do not want to end episodes hailing him. But
(12:40):
we'll see how things go since we're talking about him. Anyway,
there was some reporting that I didn't quite get to
the bottom of, probably almost a week ago, about some
of his employees showing up in some government offices, and
what I had read initially was quote plugging in hard
(13:00):
drives to government servers. Now that sets off some security
red flags. Do you know anything about that? Is this
coming up a lot in the hacking world on the
dark web? Are you guys all talking about this?
Speaker 2 (13:13):
Inasmuch as we're able to separate, you know, wheat
from chaff of everything that's going on, I haven't seen
it belabored too much, contrary to what you might have
seen in television or film. Generally, and I'm saying generally
because it's not one hundred percent the case, but generally,
(13:36):
the standard out of the box Windows Defender protection suite
that comes with your run of the mill, off the
shelf computer generally doesn't allow auto execution of you know,
plugging a hard drive and it immediately starts running a
script that copies everything from the local drive to this
external one.
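[Editor's note: a rough sketch of the mechanism being described. Legacy Windows AutoRun read an autorun.inf file off removable media and launched whatever its open= entry pointed at; since a 2011 Windows update, and with Defender's modern defaults, that entry is no longer honored for USB drives. The file contents, the payload name, and the helper functions below are hypothetical, purely for illustration.]

```python
# Illustrative sketch only: what a legacy autorun.inf looked like and what
# old AutoRun would have launched from it. The payload name and label here
# are made up for demonstration.
import configparser

SAMPLE_AUTORUN_INF = """
[autorun]
open=payload.exe
icon=drive.ico
label=Totally Safe Drive
"""

def parse_autorun(inf_text):
    """Parse the [autorun] section the way legacy AutoRun did.
    configparser lowercases keys, matching the format's case-insensitivity."""
    cp = configparser.ConfigParser()
    cp.read_string(inf_text)
    return dict(cp["autorun"]) if cp.has_section("autorun") else {}

def legacy_autorun_target(entries):
    """On pre-2011 Windows, the 'open' entry launched automatically when the
    drive was inserted; on patched systems it is simply ignored."""
    return entries.get("open")

entries = parse_autorun(SAMPLE_AUTORUN_INF)
print(legacy_autorun_target(entries))  # payload.exe
```

On a patched system the file still parses fine; the operating system just never acts on the `open` entry, which is the behavior change being described.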
Speaker 1 (13:55):
Okay, I have seen that in TV and movies.
Speaker 2 (13:57):
Yes, yes. In reality, and again I'm generalizing here, in reality,
executing something like that is not going to happen in
ninety nine percent of cases. I will say not to
say that it can't, because you could very easily, you know,
(14:18):
register something as an auto launch, you know, plug it
in a way you go. But realistically, I think the
bigger threat is not necessarily the data being copied off,
because at this point whether it's you know, Bernie from
the mail room sent with a handwritten napkin note for
(14:38):
Elon versus someone with an official enforceable document and a
warrant from you know, a judge in the pocket. Realistically speaking,
it doesn't hold a candle to the power that Elon
now assumes, having the control he does over the OPM
computer systems as well as access to the Treasury. Like,
(15:03):
this ultimately becomes the bigger harbinger of danger because most
of that data. Let's be honest, it's government. Government is slow.
Government doesn't do a great job, because of how stratified
it is, of having uniform security practices to prevent things
like plugging in a hard drive and having it just
(15:23):
to automatically start running scripts. Which is why I say
generally Windows Defender, and I'm specifically highlighting Windows Defender because
that's, I'm going to guess, a ninety nine percent likelihood of
the systems that are being run: Windows eight, Windows ten,
possibly Windows eleven. The government generally doesn't do a lot
(15:47):
with open source, although, again, I think I mentioned this
in the previous episode that I was on: the internet,
as we know it, is backboned on Linux. But that's
another rant, Getting the data out of the ecosystem it's
currently in doesn't really do a whole lot and isn't
(16:09):
really necessary because if you already have complete control over
your source system, you don't need to make a copy
for an external system. And right now those checks and
balances aren't really in place for anyone to be in
a position to come in and go, hey, Elon, pack
up your shit and get the hell out.
Speaker 1 (16:26):
Okay, yeah, control of the source system. So let's talk
about that. Here's some reporting from Reuters. The initial story
was January thirty first, with some updates over the last
like twenty four hours. The headline is "Musk aides lock
workers out of OPM computer systems." Now OPM is the
(16:47):
Office of, what was it, Personnel Management. Yes, and this is
a fairly long article, but yada yada, they've set up
like a bunch of futons and shit. His, whatever his
people are doing, they're in there twenty four hours a
day right now, and they seem to have control over
(17:11):
what would be... okay. The systems include a vast database
called Enterprise Human Resources Integration, which contains dates of birth,
social security numbers, appraisals, home addresses, pay grades, and
length of service of government workers.
So he is currently having people do something with the
(17:32):
entire list of everybody that works for the federal government
at the very least.
Speaker 2 (17:36):
Okay, So I'm just going to say it because I'm
fairly certain we have a listener screaming at their streaming
device right now with the following. Yes, it's HUAC. The
twenty first century equivalent of HUAC is likely what I
think they are assembling. For those who weren't around in
(17:56):
the nineteen fifties, or who aren't old enough to know,
HUAC is an acronym that stands for the House
Un-American Activities Committee. Yes, very famously headed by Senator Joe
(18:20):
McCarthy during his Red Scare witch hunt, where he basically
initially claimed, without any level of evidence, that he had
a list of and I forget the number, but it
was a staggering number. I think it was like ten
thousand plus of communists that were working within the federal government,
(18:42):
and this provided the bedrock for him to begin investigating
everybody from political opponents to people in the same party
to kind of bookend this, because I'm trying to
provide a little bit of hope: he was eventually brought
down when he went after Dwight Eisenhower and by the
(19:05):
journalistic activities of Edward R. Murrow basically lambasting him and
effectively calling him out, like, come on, get me
in front, get me in front of your panel. I
would love to testify, and McCarthy didn't want to pull
that trigger. So yeah, anyway, bringing it back to Musk
(19:25):
and his cronies, I am fairly certain that whatever they're
doing under cloak of darkness is very likely assembling some
list that we're going to hear a lot about in
the very near future of people who are at worst,
and I'm saying at worst in reality, at worst, political dissidents,
(19:48):
people who don't agree with this regime and who are
likely to not go quietly into that good night exercising
and carrying out the broader objectives and edicts coming out
of the White House, and at best, or sorry, at worst,
(20:11):
probably people who would actively work to slow down and
sabotage I'm going to say the cruel and malicious agenda,
not necessarily with the former where they are likely to
just carry it out ineptly, but actively working to sabotage it,
(20:33):
to do everything in their power to render it toothless
and ineffective. I'm fully convinced at this point, based on
everything that I have seen and everything that I have read,
that what we will eventually see is basically claims of
an itemized list of everyone in the government that has
(20:57):
worked to subvert it in some way. Though those ways,
mark my words, will be entirely ambiguous and nebulous, without
any measure of concrete or pinpoint examples or evidence. And
this is how they clean house.
Speaker 1 (21:18):
Well, yeah, there's a lot happening in regard to our
federal government, at least, and how it relates to our technology.
I mean, there are crossover roles there. I mean,
the Internet is pretty complicated, but some of it did
start at a governmental level, right. And also we have all
these concerns about our data and privacy and can you
(21:41):
plug in a hard drive into a really old, out
of date computer and just have it start copying all
your data? Well, if our government starts running low on employees,
everything's going to suffer, from, you know, IT support and
cybersecurity defenses, to whether or not we're, oh, I don't know,
fighting off new communicable diseases or getting people their welfare checks.
Speaker 2 (22:04):
Look, I'm just going to say, on the upside, if
you were ever going to cheat on your taxes, this
is probably the year to do it. At best... at worst,
to your point, we have a guy currently, and I
don't think he's been confirmed yet, hopefully RFK Junior, sitting
(22:25):
for confirmation for a cabinet position with Health and Human Services.
Not a doctor, not accredited, not qualified, lacking in any
medical training whatsoever. I don't put a lot of faith
in him being competent or capable for the position he's
(22:47):
being considered for. But independent of that, let's examine the
ideas He wants to remove fluoride from water, which has
shown that it actually helps prevent tooth decay. Arguably not
as much these days, because we have fluoride in toothpastes
and mouthwashes and everything else, so we're not going to
see as drastic of a benefit from the program.
Speaker 1 (23:11):
And it's also not hurting anybody exactly.
Speaker 2 (23:15):
It's you know, we're not talking lead in pipes, which
any amount of lead is bad. So you know, he
also wants to roll back vaccines. I also saw something
the other day that I think caused a fair amount
of handwringing, and hopefully, if nothing else has been enough
to disqualify him, this this is or will be. He
(23:40):
actually advocates for different vaccine schedules based on the color
of your skin, basically citing a Mayo Clinic study. I
believe it was a Mayo Clinic study that essentially highlighted
the difference in immune response from black people versus white
(24:02):
people and kind of how different vaccines can cause different
severity of immune response, and RFK basically saying, well, that's
enough evidence to advocate that white people should have a
different vaccine schedule than black people, which not at all
the point of that study, not at all the takeaway,
(24:22):
not at all what the evidence concludes or supports.
Speaker 1 (24:27):
Happy Black History Month, everybody.
Speaker 2 (24:30):
Yeah, pretty much, it's all depressing.
Speaker 1 (24:35):
His confirmation vote appears to be coming up on this Tuesday,
and so by the time if you're listening to this
when you first wake up on Monday morning, we've only
got about twenty four hours to find out if RFK
Junior will be our new Secretary of Health and Human Services.
Oh god, if there is.
Speaker 2 (24:54):
A god in heaven, and I'm not talking the Abrahamic God,
just any god, Buddha, Ganesh, Confucius, crowd.
Speaker 1 (25:03):
What about aliens with big eyes? I'm fine with that.
Speaker 2 (25:07):
The crystal skull ones from that horrible Indiana Jones movie
no one talks about. Just if any higher power, intelligent
creator exists anywhere in this plane of existence or another,
he will not be confirmed. I'm just saying, your move God,
(25:31):
to your point, your question, of Musk and his cronies
having control of the OPM computers...
Speaker 1 (25:42):
Databases, everything about our federal government's employees, I.
Speaker 2 (25:47):
Mean, look, it's it's the thing I think I was
saying on the last episode that I was on. Good
and bad does not necessarily mean factual or inaccurate. It's
what you choose to do with it. So Musk and
his cronies having this information, it's more information than I
(26:10):
think anyone currently under the microscope is comfortable with them having.
But ultimately, how much damage is done with that information
will determine whether or not it's good information or bad information.
If he decides that everyone who currently draws a check
from the OPM gets a million dollars, I don't think
(26:33):
they're going to complain too much.
Speaker 1 (26:35):
Sure, that's... sure. Yep, yep. Go on.
Speaker 2 (26:39):
If he decides that every one of the people on
those lists is now going to be put into a political
dissident camp a la Soviet gulag, they're probably not going
to be too keen on the fact that their personal
lives were gone through like that.
Speaker 1 (26:54):
So I feel like the most likely thing right now
is that they will be fired rather than put into
a prison camp.
Speaker 2 (27:01):
But I think that's how it starts. It's HUAC again.
It's the list, ultimately. And this leads into
another tech thing. I fully expect there will be AI
coming into effect with whatever information is taken off of
this with the prompt or the question how likely is
(27:24):
a person with this information to become a true believer
in President Kidney Stone and his cause?
Speaker 1 (27:32):
Since you mentioned AI, one of the things that came
out of Trump's first week in office was something called well,
either Project Stargate or like Stargate LLC. I'm not really
sure if there's a difference there. One of them sounds
like a government program, one of them sounds like a
private corporation, but I've seen both. Either way, it
(27:54):
seems to be mostly focused around AI, but I don't,
I don't know, like, dot dot dot, and then what?
What is going on with this Stargate thing?
Speaker 2 (28:02):
So officially, from OpenAI's announcement post, available on
their website, and I don't know, I don't think
you do show notes with links, but...
Speaker 1 (28:15):
I'll put links in. You want a link to
OpenAI in the show notes?
Speaker 2 (28:20):
I don't want to, but for people who want to
read instead of getting it regurgitated or paraphrased, they specifically
say that the Stargate Project, official name, capital S, capital
P, is a new company which intends to invest five
hundred billion, with a B, dollars over the next four
(28:42):
years building new AI infrastructure for OpenAI in the US, specifically.
My guess is this is a direct response to DeepSeek.
Speaker 1 (28:53):
I think this came out, I think the announcement of
Stargate was before we knew about DeepSeek. But let's,
let's, I want to talk about DeepSeek too. But
first, it just hit me, as you were saying, like,
infrastructure for AI. I'm pretty sure that right before he
left office, Biden had already signed on to the idea
that we were going to let Microsoft turn, like, Three
(29:16):
Mile Island back on because we need the nuclear power
to help run our AI engines.
Speaker 2 (29:26):
Thanks.
Speaker 1 (29:26):
Nvidia? Yeah, okay, I have so many questions. Nvidia
comes up a lot when talking about AI, and
Nvidia also used, well, graphics. So Nvidia makes
graphics cards, stuff that in the past was sort of
a gamer centric thing, or maybe an artist thing. Gamers and artists,
(29:48):
We needed video cards in our computers so we could
play the greatest games and we can make cool animations
and special effects and stuff. That has really changed a
lot between crypto and AI.
Speaker 2 (29:58):
So to be clear, I just want to clarify something
because I am nothing if not pedantic. I appreciate that
Nvidia doesn't make graphics cards. Nvidia engineers graphics processing units, GPUs.
Companies like Asus and Gigabyte and, formerly, EVGA, those aftermarket
(30:25):
companies create video cards using architecture that was engineered by Nvidia. Again,
I'm nothing if not pedantic. But to your point, yes,
graphics cards used to be, and I can remember
doing this many times over the past few, you know,
a few decades. You go to a computer store like
(30:48):
a Fries or a comp Usa and use both of them,
and you go to a particular area of the store
and you pick your components off the shelf, and one
of the components that you could get was a graphics card,
and up until I think it was like ninety eight
ninety nine, you did have a choice between like a
(31:10):
Voodoo card and an ATI card or an Nvidia card.
It used to be that you would get these graphics cards,
you'd put them in and you could play the latest
games: Crysis, Doom, Mass Effect, Dragon Age, sure, Need for Speed,
(31:30):
World of Warcraft, you name it, they do it. Somewhere
along the way about fifteen years ago, a little more
than that, I think it was around twenty ten twenty eleven.
Crypto kind of came about with bitcoin, and what was
found pretty quickly was that graphics cards, because of the
(31:53):
way that their data throughput pipelines were structured, were
very efficient at running the calculations necessary to effectively mine bitcoin,
and this was something that made them a bit of
a commodity. Graphics cards still were seen as very much
(32:17):
a niche hobby, or a niche commodity, I should say,
with the need for high horsepower graphics cards really
not coming into effect, or not being impacted, I should
say, in a way that drove prices up. I can
(32:39):
remember in twenty sixteen you could get an Nvidia ten
seventy for about four hundred dollars four hundred and fifty bucks,
which was reasonable. Not the flagship card even then, the
flagship card I think was like eight fifty. It was
substantially more, but it wasn't a tremendous amount more. Really,
(33:02):
where I think the turning point comes about was in
twenty twenty, when we had the release of two highly
graphic intensive games, specifically Doom for the PC, Doom Eternal
and Cyberpunk twenty seventy seven, both of which relied on
(33:22):
ray tracing, which you know, you really couldn't get with
a ten or you couldn't get with a ten seventy
at all. So you needed a two thousand series card,
a twenty fifty or twenty sixty, twenty seventy, twenty eighty,
something like that in order to really experience these games
the way they're meant to be played. That's where I
(33:43):
think the shift in the demand really started to pivot,
with Nvidia looking at what the customer base of
their cards en masse really was, and they capitalized on
it because of the supply chain disruptions as a result
(34:05):
of the pandemic and basically effectively doubled their pricing overnight.
And it has not come down.
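[Editor's note: a toy sketch of why graphics hardware ended up mining bitcoin at all. The work is the same small hash computation repeated over millions of independent nonce guesses, which maps directly onto thousands of GPU cores. Real mining double-SHA-256es an 80-byte block header against a full 256-bit target; this simplified stand-in just hunts for a short hex prefix, and the header bytes are made up.]

```python
# Toy proof-of-work illustrating the mining calculation: each nonce is an
# independent trial, which is the embarrassingly parallel shape GPUs excel at.
import hashlib

def mine(header, difficulty_prefix, max_nonce=2_000_000):
    """Find a nonce whose double SHA-256 digest starts with the required
    hex prefix. Returns (nonce, digest), or None if the budget runs out."""
    for nonce in range(max_nonce):
        candidate = header + nonce.to_bytes(8, "little")
        digest = hashlib.sha256(hashlib.sha256(candidate).digest()).hexdigest()
        if digest.startswith(difficulty_prefix):
            return nonce, digest
    return None  # no winning nonce within the search budget

# Two leading hex zeros is roughly 1-in-256 odds per attempt, so this is quick.
print(mine(b"example block header", "00"))
```

On a GPU, each core would grind on its own slice of the nonce range simultaneously; the loop above just does the same trials one at a time.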
Speaker 1 (34:15):
Nothing ever does, But yay capitalism, gas prices, grocery prices.
Nothing's ever really going to come back down.
Speaker 2 (34:23):
It never does.
Speaker 1 (34:25):
So how did this get into AI? The same kind
of idea as with crypto mining. It's like those types
of processors that are great for graphics are also really
good for calculating... how do I put it in plain English,
or whatever? They're highly efficient. I mean, so this gets
(34:46):
to kind of brass tacks of how does a computer process data?
Speaker 2 (34:50):
And we have CPU cores, central processing units, your Intels,
your AMDs, your Snapdragons now, but traditionally it was
Intel, AMD. They run on what is colloquially known as
an x eighty six architecture, basically from the old Intel
(35:11):
eighty eighty six and then three eighty six, four eighty six.
It's x eighty six, or x sixty four, and that
instruction set has been around for the better part of
half a century and it really hasn't changed very much, but
it's horribly inefficient. One of
the reasons why it's so inefficient is because, and a
(35:34):
lot of people don't know this, you could actually run
on a modern day computer. You could go to the
store and buy an off the shelf laptop, bring it home,
and with just the right amount of tinkering, you can
actually run DOS. You could run MS-DOS on that,
like old MS-DOS three point one. You could put
Windows three point one or Windows ninety five. I don't
(35:56):
recommend that you do, but you could because all of
those legacy instruction sets, all of those ancient archaic structures,
still exist within the chips. They're still able to run
all of that stuff. So modern processors, and I'm talking
(36:18):
specifically around the x eighty six x sixty four chip architecture,
not the ARM that's a whole different discussion, they're not
super efficient at the way that they process data, which
is part of the reason why for the past decade
and change, the push for getting better performance out has
(36:39):
been higher clock speeds and more cores because the data
pipelines really haven't seen a major watershed moment of breakthrough
since AMD integrated the northbridge controller directly on the chip. For those like me who've
(37:00):
been around for a long time. If you can remember
setting your front side bus multiplier to have it coincide
with your RAM clock. That used to be a separate
chip on the motherboard that you would have to tweak
just right in order to get the maximum performance out
of your RAM. And that was a whole self contained system.
And AMD basically cracked the code and was able to
(37:23):
get that chip controller integrated into the CPU. So since
then you don't really need a northbridge. It's technically
still there, but you don't... the central processor handles that.
But going back to your immediate question, the central processor
(37:44):
is not super efficient at handling all of these floating
point operations. Like, it just can't handle them the way
that a graphics processing unit can, because the graphics processing
unit is a lot more nimble in the way that
it takes data, it in processes it, and then gives output.
So you have something like an Nvidia card, a
(38:07):
graphics card specifically, that isn't trying to run an operating system.
It isn't trying to handle all of the interrupts and
all of the commits for a keyboard and a mouse
and a monitor and a power supply and a network
card and you know, hard drive speeds and hard drive
data in, hard drive data out, you know, NVMes, all
(38:30):
of this stuff. It's not managing the southbridge
for, you know, your peripheral connections, your speakers, you know,
all of these things that the central processor really does
have to keep up with kind of at all times.
The graphics processor theoretically just does that. It processes graphics,
(38:52):
but if you feed it an instruction set that doesn't
require a video rendering output, it can devote all of
its processing capability to crunching on whatever you gave it
without having to render anything on the screen. And that's
part of what makes it so efficient at doing this
crunching is it's effectively a little computer in a bigger
(39:13):
computer able to handle a subset of instructions with exclusive
focus on those things. And so the more graphics cores,
the more clock speed, the more RAM, the more efficient,
the more powerful that you can make that graphics card,
the better your output. And AI is really great about
that because if you can push all of the workloads
(39:36):
directly to this little card-based computer, effectively a system on a chip, you get really good performance out
of it because again you're not having to deal with
all the extracurricular peripheral overhead. You're just letting it focus
on the data sets you've given it for processing. And
(39:57):
that creates a demand with AI as everyone's trying to
use it. And I think what we've really seen within
the global market, the world markets overall, not the stores,
but global markets, is people are in search of a
use case. Yes, you can have AI transcribe a meeting
(40:20):
or give you cliffs notes, or look at a resume,
or build you a table or output some code. Those
are great and they're helpful. They do what technology is
supposed to do, make arduous tasks easier. Yeah, and I
actually like AI for some of.
Speaker 1 (40:36):
The I've used AI for some of those things. But then, well,
what's the flip side, as you were about to say, right.
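Neon Chaos's GPU point a moment ago can be sketched in a few lines. This is a toy illustration, not GPU code: NumPy's vectorized operations are a CPU-side stand-in for the "one operation applied across a huge batch of floats at once" style that GPUs scale across thousands of cores, free of the OS and peripheral bookkeeping a CPU carries.

```python
# Toy sketch of the data-parallel idea from the discussion above. NumPy's
# vectorized ops stand in for the GPU style: the same floating point
# operation applied across a whole array at once, versus the CPU style of
# walking elements one at a time. A real GPU framework (CUDA, ROCm) runs
# this same pattern across thousands of cores.
import numpy as np

def scale_loop(values, factor):
    # "General-purpose CPU" style: one element at a time.
    return [v * factor for v in values]

def scale_vectorized(values, factor):
    # "GPU" style: one operation over the entire array.
    return np.asarray(values) * factor

data = [0.5, 1.0, 2.0, 4.0]
# Both paths compute the same result; the difference is how the work is issued.
assert np.allclose(scale_loop(data, 3.0), scale_vectorized(data, 3.0))
```

The names and sizes here are made up for illustration; the point is only the shape of the workload, not real performance numbers.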
Speaker 2 (40:42):
Well, the flip side is that helps you and I
as the individuals. That helps you be more efficient and
more effective at your job. If you have a supervisor
that wants a report of how many computers are doing
X or how many attorneys are doing blah, you can
write that and you can get that done with a query,
and that's fairly simple. But generally the business can't make
(41:05):
money from that. There's no product that comes from that.
The fact that you are able to more efficiently do
your job ultimately frees you up, theoretically, to do more work,
which is good for the bottom line. Yeah, but it
doesn't carry with it a net positive for the amount
(41:25):
of investment that goes into standing up this hardware infrastructure
for you to be able to punch a query in
and get a response back. And so this is where
I think, by and large, we are looking for and
by we, I don't mean you, and I I mean
again global companies, enterprises are looking for where is the ROI,
(41:49):
where is the return on investment? And I think that's
where we see the headlines that perpetually go back to
Coders are going to get replaced, Secretaries are going to
get replaced, all these different jobs that they think can
be operationalized into an AI prompt to get rid of
what is traditionally overhead folks. That's where the value proposition
(42:13):
is being focused is you don't need coders, you don't
need a development team. All you need is a high
schooler who knows his way around the keyboard that can
copy and paste, and you have everything you need. I.e., Elon Musk hiring a kid fresh out of high
school to basically help with his let's call them endeavors.
Speaker 1 (42:35):
So is something like Project Stargate, then, or Stargate LLC? Like,
that's going to be infrastructure behind AI, which is why
these companies can work together, because they're all looking for
ways to get more electricity. But that comes back to
the so to sort of tile these together. All these
video processors that are being bought to process the both
(43:01):
trained the model, but also process or output. Those are
using significant amounts of electricity, as well as generating a significant amount of heat.
Speaker 2 (43:11):
Let's be clear, every war that has ever been fought
in the history of mankind, from the earliest days of
us launching rocks at each other, to, you know, the far
flung two weeks from now, when the nuclear missiles are
flying in the sky headed for major metropolitan centers.
Speaker 1 (43:29):
Two weeks from now, did you say.
Speaker 2 (43:32):
Don't quote me on that. God, I hope I'm wrong.
They have all been fought over one thing, who controls resources.
It's all resource wars. And to your point, these graphics cards,
as efficient as they are, they are incredibly power hungry.
Speaker 1 (43:50):
Yeah.
Speaker 2 (43:51):
I don't know if you've looked at the specs for
the 5000-series Nvidia cards, but they are ridiculous. To run an entire blade of 5000-series graphics cards in a data center, you need almost a
megawatt of power, right, which is where we get to
(44:13):
three Mile Island being stood back up and with the
increased demand for AI, you've got to have a way
to power these things.
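The back-of-envelope math behind that claim looks like this. The per-card wattage is an assumed illustration figure (roughly the ballpark of a top-end 5000-series card), not a number from the episode, and it ignores cooling and the rest of the server, which pushes real facility draw even higher.

```python
# Rough power arithmetic for the megawatt claim above. CARD_WATTS is an
# assumed illustrative figure for a top-end 5000-series GPU, not an exact
# spec, and cooling/CPU/storage overhead is ignored here.
CARD_WATTS = 600          # assumed draw per high-end GPU, in watts
MEGAWATT = 1_000_000      # one megawatt, in watts

cards_per_megawatt = MEGAWATT // CARD_WATTS
print(cards_per_megawatt)  # → 1666
```

So on these assumptions, a megawatt of raw card power is on the order of 1,600-plus GPUs before you even count cooling, which is why the conversation turns to things like restarting Three Mile Island.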
Speaker 1 (44:23):
I would like to give a little context, and we
could use a laugh. So this goes back to twenty
twenty one, our almost Attorney General Matt Gaetz started. This
is when he started getting caught up in this controversy
about how he was trafficking underage girls across
state lines, having sex with underage girls, paying them for sex,
(44:45):
using Venmo for it. Holy shit, talk about a data
privacy problem. But it wasn't just him. He was not
alone in those endeavors. He had a friend named Joel Greenberg.
Does that name sound familiar to you at all?
Speaker 2 (45:01):
A little bit.
Speaker 1 (45:02):
Yeah. So he was this Florida, I don't know, attorney
or something.
Speaker 2 (45:07):
Tax collector in Seminole County.
Speaker 1 (45:10):
In twenty nineteen, Greenberg set up a company that would
allow taxpayers to pay their bills with cryptocurrency. No payments were ever processed through the company. About ninety thousand dollars in
public funds was used to install server equipment in a
room in Greenberg's private office that only the tax collector
and a vendor had keypad access to. The equipment
(45:32):
was moved last year, so I guess that would have
been twenty twenty to a branch office, where it caused
a power surge, creating a fire that resulted in sixty
seven hundred dollars in damage that wasn't covered by insurance.
These things get hot, they need a lot of power,
and you need a place to host them that can
handle that kind of power input and output and heat.
(45:52):
And when you just set it up in a closet
in an office, you could burn down the building. Yes,
so five hundred billion dollars is going to go into
trying to solve this problem. And where I'm sitting, as
you know, kind of a tech guy but mostly a
lay person, is like, what will we get out of this?
(46:14):
Because it doesn't seem like AI is really living up
to its promises. But then there's also the sci fi
part of me that's like, well, yeah, someday we will
be able to like say, computer, make me a sandwich,
and the computer will just make you a sandwich. And
that's a long, long ways off. But do we have
to go through this trillions of dollars of agony of
(46:35):
like letting these crappy AI companies do their stuff in
order for us to get to that better, you know, Jetsons and Star Trek future.
Speaker 2 (46:46):
So let's be clear. We live in a capitalist society.
We live in a world predicated on capitalism. So the
number one question anytime that you run into something like
this is is how do you monetize it? How do
you make money from it? It's one of the questions
I get every time I tell someone about the Fediverse.
(47:08):
You can just sign up for Lemmy. It doesn't cost
you anything. You can host it yourself. It's nothing but
open source engineers putting their code out there for you
to use and abuse as you choose. And the very first
question I get is, well, how do people make money
from it? That's the same question with AI, how do
you make money from it? It's the same thing I
(47:30):
was just talking about. It's how do you monetize it?
So to your question, must we go through the infant
stumbling in order to get to Rosie the Robot or R2-D2, to build off of your Jetsons analogy. Yeah,
you know, how do we get from the AI telling
(47:53):
you to put super glue on your pizza and drink
bleach to kill COVID? Oh wait, that was the president?
How do we get from there to working three days
a week, four hours a day and literally all we
have to do is press a button. My answer to
(48:14):
that is there are basically two paths that we can
go down. The first is the responsible gradual iteration, where
we develop responsibly, intelligently. We iterate carefully, we learn from
mistakes of previous iterations. We go boldly, but we are
(48:40):
reserved about how much power we give the machine. The
challenge with that route is you can't make money quickly
because slow iteration doesn't get people excited, it doesn't get investment,
it doesn't get people wanting to give you money to
(49:03):
develop something. If the product that you are going that
they are investing in, that you are going to develop
is going to take three years, five years, ten years
between iterations because you're being responsible and cautious about.
Speaker 1 (49:22):
It, boring well.
Speaker 2 (49:25):
Facebook used to have an axiom internally called move fast
and break things.
Speaker 1 (49:31):
Yeah, like Google used to have an axiom that said,
you know, do no evil.
Speaker 2 (49:36):
Don't be evil. They were able to change that. Well, and I specifically highlight
Facebook's axiom because that I feel like has become the pervading,
all encompassing guiding light for a lot of the tech
companies now is: who gives a shit what we break, as long as we can deliver something to the investors
(49:56):
and it doesn't even have to work. It's Homer's car.
But whereas in that episode his brother was ruined because
people saw the car and thought, what the hell? We
no longer have savvy investors, We no longer have people
holding the purse strings who are cautious about stuff because
(50:20):
we have this cult of personality around the tech bros: Zuckerberg, Musk, Bezos, Altman,
Sam Bankman-Fried before he went to jail, because there
is this cult of personality. Fucking Martin Shkreli, for God's sake.
Speaker 1 (50:36):
Well, no one ever liked that guy, did they?
Speaker 2 (50:39):
They didn't like him, but they gave him millions of dollars.
Remember he still owns a Wu-Tang Clan album
that only he gets to decide what gets done with it.
Speaker 1 (50:51):
Hey, that was the original NFT exactly.
Speaker 2 (50:55):
But because there is this cult of personality around the tech bro,
what ends up happening is, where twenty years ago, Jeff
Bezos was constantly having investor calls of like, dude, when
am I going to see return on investment? When am
I going to get a dividend check? When is the
stock going to be in the green? What the hell
(51:16):
is going on? Why should I give you another one
hundred million dollars? And Bezos had to make that case
for like, you should do it because this it's going
to pay off, and then of course we all know it,
did. Yeah. Now the investor doesn't ask that question. It's
(51:36):
the tech bro walking in in flip flops and a
hoodie and saying I need fifty million dollars to develop
an AI startup. And the only question is how is
your AI startup going to be different from anything else?
And if you spit out enough tech jargon that goes
right over their head, they will dig out their checkbook
and write you for a hundred million because they want
to be first to the table.
Speaker 1 (51:58):
So why are we doing that? I could use the.
Speaker 2 (52:02):
Money because we have scruples.
Speaker 1 (52:04):
Ah damn it? All right?
Speaker 2 (52:08):
Well, So to bring it back to your question, Yeah,
of you know, must we have the stumbling baby before
we get rosy. We don't have to. We can be
responsible about it. But what is the actual use case
(52:31):
for AI? AI used responsibly streamlines the jobs of people
like you, people like me, people like your listeners, to
be able to do eight hours of work in five.
In a just world, that extra three hours would be
your time to do with as you please. Go to
(52:51):
the beach, go bowling, read a book, learn a new skill,
pick up the guitar, learn to paint, spend time with
your kids, raise a family, watch a movie, write, you know, poetry.
You could use that time for self enrichment, which is
what we saw around the time of the Enlightenment. People
(53:11):
didn't just decide that they were going to start painting
and reading and writing and telling great stories and doing
all these things that we associate with the Enlightenment. That
wasn't just that people woke up one day. It was that the work that had previously taken all of the
time and all of the energy to get done, leaving
(53:31):
people with no personal resources to do any of
this self enrichment. The technology had reached a point where
that work was now streamlined, and so people had all
this free time. People were able to devote time and
effort and energy to self enrichment, to painting, to writing,
to sculpting, to making music, writing symphonies, you know, all
(53:53):
of this great stuff. But that doesn't align itself with
the American way of life and today's day and age,
because everything has to be monetized. And so to your
question of what is the end goal, we have AI,
and then what AI is not for me and you?
(54:13):
And that really is kind of the dirty little secret
in all of this. AI is not intended for me
and you. We can use AI, but we can use
AI to build better models so that we can be
advertised to We can use AI to free us up
(54:34):
to do more work in the same number of hours
or in more hours, to increase value for the corporations
and the enterprises so that they can get better investments
so the stock price goes up. It's not intended to
make my life easier, your life easier, or anyone listening's
(54:54):
life easier. It's intended to make money for the people
who throw cash into it now knowing that what they're
going to get on the back end is better ad targeting,
better corporate contracts, better corporate leverage. They're going to have
more people investing, more money, driving profits, making line go up,
(55:16):
and in a capitalist society that's all that matters. Line
go up.
Speaker 1 (55:22):
Well, that's going to be very interesting in the weeks
and months to come, just broadly speaking from an economic standpoint,
as, you know, tariffs and stuff get enacted. But
I want to stay on the AI topic because the
other big headline from the last week is the unveiling
of this thing called deep Seek, which I, if I've
(55:45):
got my information right, is sort of the opposite of
what we have where open AI needs, you know, half
a trillion dollars worth of infrastructure to be able to
run servers that can you know, help me write a
song that sounds like Sharon. Deep Seek claims at least
to be able to run locally on someone's PC and
(56:07):
provide the same quality of output. But you're holding up
a very small computer in front of the camera. Is
that a Raspberry Pi or something?
Speaker 2 (56:17):
It's a Raspberry Pi. It's a Raspberry Pi 4. But
to illustrate your.
Speaker 1 (56:21):
Point, yeah, you out of your house, you could have
it generate the same type of output that open ai
is generating right now. Chat GPT is generating right now,
and for a fraction of the cost. And it only
took like they say, you know whatever, but they say
it took what like ten million dollars twenty million dollars
(56:42):
or something of that to develop it, where chat gpt
is already in the tens of billions. What is the
reality behind deep seek? Is it just? Is it? Like?
Is it like when Amazon said, will scan your food
when you walk out of the store and it turned
out it was a thousand Indian guys in a trench coat?
(57:03):
Is Deep Seek just a guy sitting back there typing rapidly all the stuff and pretending like he's a computer? A real Mechanical Turk kind of thing?
Speaker 2 (57:12):
So, when you're as deep in the trenches about technology
as I am, the proposition of AI running from your
desktop that's not awe inspiring because that's something that's been
around for the better part of the past five years.
There are open source models. Hugging Face, for instance, is
(57:35):
a marketplace, essentially, and you can run a desktop model.
There's a program for Windows AI Studio I think is
what it's called, where you can literally just go through
a marketplace and you can download free open source models
of LLMs, things like Mistral, things like Gemini, things like
(58:03):
Bard, Grok. You can download open source
versions of these models. You can load them into the software,
and you can run them on your own hardware.
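The "runs entirely on your own hardware" point can be made concrete with a deliberately tiny stand-in. This is not a real LLM, just a word-pair table in pure Python, but it shows the shape of local inference: once the "model" lives on your machine, generating text is pure local computation with no server or network call. Real desktop runners like the marketplace apps mentioned do the same thing with actual downloaded model weights.

```python
# A toy stand-in for the local-model idea above: once the "model" (here, a
# word-pair table built from a seed text) is on your machine, generation is
# just local computation. No subscription, no server, no network call.
# Desktop LLM runners do the same thing with real downloaded weights.
import random

def train(text):
    # Build a next-word table from adjacent word pairs in the seed text.
    words = text.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length, seed=0):
    # Seeded so the sketch is reproducible; stops if a word has no successor.
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        nxt = model.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

corpus = "the model runs on the desktop and the desktop runs the model"
m = train(corpus)
print(generate(m, "the", 5))
```

Everything here, corpus included, is invented for illustration; the only point being made is that nothing in the loop ever leaves the machine.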
Speaker 1 (58:15):
And deep seek is open source right like it's on
GitHub right now. If you have the know how, you
could download it, compile it, and run it on your
own machine.
Speaker 2 (58:24):
Right. So, to answer your question, I don't think it's
a thousand Indian guys in a trench coat. I don't
think it's one. I don't think it's the bartender from
Who Framed Roger Rabbit. I don't think it's Max Rebo, you know, with four limbs. I genuinely think that Deep
Seek effectively boils down to a hyper efficient iteration of
(58:50):
one of these open source models, basically put together
in such a way that again it's hyper efficient. Desktop
processing for LLMs is a thing that's been around for
a little while. I've run it myself. I have some
hardware that's running effectively Desktop AI and it runs okay.
(59:12):
I don't use it for any, you know, DALL-E processing
or any of that stuff. But I can ask it
basic questions. I can have it look at something, I can have it write stories, I can have it, you know, give me directions, find recipes, things like that,
and it never goes to the Internet. It just runs
with the model that's currently on it, and it can iterate
(59:34):
based off of that. So something like deep seek really
just seems like a natural evolution of this non enterprise backbone,
you know, And ultimately that's what we want. Again to
my previous point, AI is not for me and you.
We're supposed to use it, but we're not supposed to
(59:56):
benefit from it. But if we can use it on
our desktops, in our everyday life, without being reliant
on subscription fees or anything like that, why not. But
then you get into the whole self hosting thing and
people feeling like they need a computer science degree in
order to launch things.
Speaker 1 (01:00:16):
So I attended, as part of my day job a
couple seminars this week, and there was a fair amount
of cybersecurity talk, and Deep Seek being so brand new,
one of the particular speakers was not fully prepared to
talk about it, but the initial impression was like, sure,
(01:00:36):
Deep Seek might be a faster and much, much cheaper
way to get your AI work done. But China, and
I assume that there will be a lot of analysis,
especially in the hacker community, of the open source code
for Deep Seek that's already been put out there, to
(01:00:58):
make sure that there isn't some guy somewhere in China
that's going to be taking all of our stuff and
then using it for profit or evil? I don't know,
But I since you are in that community yourself, where
do you see the line drawn there between? Like what
(01:01:21):
we know OpenAI has been doing with ChatGPT:
trawling through the internet, taking all the data it can
without asking, first training AI model and now charging people
to use, you know, something that was built on our
public data, and we don't know what their endgame is really.
(01:01:42):
But versus if that same concept is being hosted in
the People's Republic of China, is there some reason why I,
just a guy, should be worried about the China of
it all when deciding if I was going to use
one AI engine versus another.
Speaker 2 (01:02:04):
So I'm gonna answer that question by asking you a question,
why is China bad?
Speaker 1 (01:02:09):
Wow? Because communism? Well, look, I mean, I think there
are fair criticisms of China. You know, you mentioned their Belt and Road, is that what it's called, Belt and Road program? Like,
they are making contracts with poor nations, and they're very
advantageous to China, more from a capitalist perspective than a
(01:02:30):
communist perspective.
Speaker 2 (01:02:32):
Oh, where have I seen that playbook before?
Speaker 1 (01:02:33):
Right? But I mean there, But to that end, are
they any better than us? Probably not. They are a
big economic powerhouse in the world. They do have someone in the presidency that may be sort of dictatorial.
We hear stories about, you know, reporters or other political
dissidents going missing, but we don't actually know. I mean,
(01:02:57):
that's just what we hear on our side. So I
don't know. I don't know what's so bad about China,
I guess, but we're told they're bad.
Speaker 2 (01:03:04):
We're also told that Jeffrey Epstein hung himself. We're also
told that Jimmy Hoffa just magically disappeared. We're also told
that Lee Harvey Oswald managed to shoot the president in
open air with a bolt action rifle and got off
I think it's three or four shots, and that there's
(01:03:24):
a magic bullet that somehow managed to cause the amount of damage it did. We're fed a lot of things,
and we kind of accept it just on basis of
it seems plausible and we don't want to think too
much about it. Does the state of China have a
spotty record? I would say no more and no less
(01:03:46):
than the US as a global entity. I mean, again,
not to get off on a history lesson, but the
country of Iran had a democraticly elected leader, and under
Dwight Eisenhower, the CIA deposed him to reinstate the Shah
(01:04:07):
because there were concerns with the business interests within the
US that that democratic leader was going to be unsympathetic to US business interests and the Shah was going to be more sympathetic and more American-centric.
Speaker 1 (01:04:26):
A lot of stories like that, especially in Central and
South America. The CIA has done a lot. Yeah, right,
So America has often not been honest with its people
about why we are taking certain stances or god forbid
actions in the geopolitical sphere. So it sounds like your
answer to my question is pretty much the same as
(01:04:47):
my question in the first place, which is just a
giant shrug emoji, like, is Deep Seek dangerous because it's
coming out of China? Your take would be on its face,
probably not any more dangerous than anything else.
Speaker 2 (01:05:00):
It's open source, which means that if you are so inclined,
you can audit the code. Now here's the thing. Open
source doesn't inherently mean more or less secure, or more
or less prone to malicious actors or malicious intent or
exploitation or error or hack or anything else. It just
means that you can look at the code and see
everything it's doing. Think of it this way. You work
(01:05:23):
it for a living. If I hand you a PowerShell
script and I hand you the .ps1 file, you can
you can open that up and you can see everything
that it's doing, all the calls it's making, the modules
it's loading, what it's looking at, the directories it's operating in.
You can see all of that, right yep. So if
I hand you a flash drive, or I send you an
(01:05:45):
email with a .ps1 attached to it, and
you download that and you open it up, you can
see everything that's doing. And if I tell you run
that file, you feel reasonably comfortable doing that because you're
able to see what that script is doing. That's the
beauty of open source. You can see everything that it's doing.
(01:06:07):
You can tweak that code to make it do something
different if you're not comfortable with it. If I have
using that example again, if I send you that .ps1 file and it is writing a log file to
a directory that you're not comfortable with, you can edit
that file and tell it not to do that, or
(01:06:27):
tell it to write to a different directory. Same thing
with open source.
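The script-auditing point Neon Chaos keeps coming back to can be sketched directly. Since PowerShell itself isn't being shown here, this is a Python stand-in for that .ps1 example, and the log path is a made-up value: the point is that a plain-text script exposes every call it makes, so if you don't like where it writes its log, you can see that line and change it.

```python
# A Python stand-in for the .ps1 auditing example above: because the source
# is plain text, you can read exactly what it touches, and edit the one line
# you don't like. LOG_PATH is a made-up illustrative value, not a real
# convention from any tool.
import tempfile
from pathlib import Path

# Everything the "script" does is visible right here: it writes one log line
# to a directory you can see, and therefore change.
LOG_PATH = Path(tempfile.gettempdir()) / "example.log"

def run(log_path=LOG_PATH):
    log_path.write_text("ran at startup\n")
    return log_path

# If you're not comfortable with where it logs, point it somewhere else:
custom = run(Path(tempfile.gettempdir()) / "my_choice.log")
print(custom.read_text())
```

That is the whole open-source argument in miniature: transparency doesn't make the script safe, but it makes its behavior inspectable and editable before you run it.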
Speaker 1 (01:06:32):
Well, right, I could make a change if there was
something if I was skilled enough to read the code
and there was something that made me uncomfortable, I could
still benefit from the rest of the work, but make
a change to the thing that I didn't like a
log file for instance, exactly. But to that end, though,
once Deep Seek's, well, it is out there, so someone
(01:06:54):
could take Deep Seek's code and insert something nefarious into it. We could scan what's already been released for nefarious code,
but we could also add nefarious code to it, So
it could be a real minefield of Deep Seek products
out there could be.
Speaker 2 (01:07:10):
But again, you've got to look at the way that version control is done within GitHub, specifically, any open source code repository: SourceForge, GitHub, any of these. The
way that they're predicated is you start a project. In
this case, we'll call it Two-F Jeff's AI, and you
(01:07:31):
take all the code from Deep Seek and you copy
it line for line, you copy it and you republish it,
and that's your project. And then you realize, wait a minute,
now I want to make a tweak. You can't post
that tweak or even your own code back to the
Deep Seek GitHub repository. You can't make changes unless you
(01:07:51):
are given permissions to write commits to that code. You
have to write commits to your own project. Now where
this gets interesting is six months down the line, when it's you and one F Jeff and Aaron and ten listeners who have gone through and pulled the code and made little tweaks, adding, you know, comment lines of, you know,
(01:08:16):
Aaron's the best, Jeff, you know two F Jeff is amazing?
what the hell, one F Jeff, whatever. You've made those changes; those are committed to your GitHub repository, to your project.
They're not committed to the main line project. So to
specifically tie it back to your question, yes, potentially there
(01:08:38):
is the opportunity for malicious engineering. And that's a thing that, you know, you should be worried about. But
if you're going back to the official repository.
Speaker 1 (01:08:51):
You know.
Speaker 2 (01:08:51):
I'll take, for instance, something like Proton Mail, which I
talked about in the last episode and a little bit
of a touchy subject because Andy got into a little
bit of hot water on his welcoming the new administration
and there's some questions there. But independent of all of that,
the Proton Mail app for Android and iOS is open source.
(01:09:17):
You can go to GitHub right now. You can look
at it, you can see it, you can audit the
code yourself. If you want to make changes to it,
you can, but it won't connect to their servers. Now,
what we're talking about with deep seek is you're not
connecting to a third party resource, to a server. You're
just running it locally on your system, so you can
make any changes you want to good, bad, inoperable, malicious,
(01:09:41):
doesn't matter. You're free to make those changes once it's
on your system, but you can't commit those changes back
to the official page. So is it something we should
worry about. Absolutely. Just again, just because it's open source
doesn't mean that it's free of malicious input. It just
means that you can see everything that it's doing. Going
back to the PS one example I gave you, if
(01:10:03):
I tell the thing to write a log file and
then delete everything in your System32 folder, that's, you know, malicious code, that's a malicious script. But you can see it do that and
you can remove it.
Speaker 1 (01:10:16):
Yeah, I know you, I know you have to go soon.
And we've already been going for a while. So the
last topic that sort of relates to both of these
is our last episode was called Requiem for a TikTok,
and we put it out on the day TikTok was
going dark, and within something like fourteen hours, TikTok was
back with a message that said, hey, hey, President Trump
(01:10:40):
saved us, even though he wasn't even president yet. But
all this was was I think a seventy five day
stay of execution that the president was allowed to invoke.
Biden could have done it, but he didn't feel like
it because because he's Biden. Since TikTok came back first
(01:11:01):
of all, some of us that use it have found
it to be kind of different. Suddenly. The big conspiracy
theories floating around it are that suddenly Facebook and Instagram
and Meta Quest all had official accounts with hundreds of thousands
of followers that weren't there before. And I did start
(01:11:22):
getting a lot of Facebook like ads in my ad
feeds that were never there before. And people started finding
that they weren't getting like their algorithm was different. They
weren't getting let's say, if you were in if you
were a leftist, if you were getting a lot of
videos about, let's say, genocide in Gaza, those dried up.
(01:11:48):
And then to accentuate the conspiracy theory, this is what
people are saying. If you block Facebook and Meta Quest and Instagram,
all of a sudden, like, your feed reloads and you
start getting all the stuff that you used to get,
Like it's kind of crazy, but also I was kind
of experiencing a little bit of that myself. And TikTok
(01:12:08):
seems okay right now after blocking all those things. But
TikTok is still being pushed to sell, and it looks
like Trump is trying to encourage a sale to Larry
Ellison of Oracle. I don't know Oracle the company or
Larry Ellison the person. I'm not sure where that's going.
(01:12:30):
But he is not one of those guys that you
hear a lot about. He's not a Zuckerberg or a Musk.
Why does he want.
Speaker 2 (01:12:36):
TikTok? Again, what's the value proposition? Information. Well, and realistically,
So again let's break it down. So TikTok is unique.
It's effective, It's captured you, it's captured its audience because
of how effective the algorithm is. It's good at serving
you up something that you actually want to watch. So
(01:13:00):
having that algorithm, having that secret sauce, that magic whatever
you want to call it, Having access to that and
being able to leverage that in another product, that's valuable
TikTok unto itself, serving you up videos? Who gives
a shit, that's whatever. YouTube does that. The problem with
(01:13:22):
YouTube is that they've done studies and they have found
that if you just let YouTube run long enough, you
can start with, you know, how to make mead, and
three and a half hours later you'll have guys giving
you the correct way to do, you know, the Nazi
salute if you just let it auto play long enough.
Speaker 1 (01:13:46):
Yes, that's fun with algorithms. I think I've mentioned that
before on the show, like it's taken people down radicalizing directions,
and I think most of that's because of the way
people use hashtags and sort of you know, optimize their
content for the algorithm right, so that it can eventually
sneak in and get you when you weren't seeing it coming.
Speaker 2 (01:14:07):
Well. But I mean think about it the other way.
Radicalization drives attention. Like, if you're being radicalized, that
sucks and it shouldn't be happening. But it's you engaging
with the software. It keeps your attention. You're doing something
(01:14:27):
with it, which is ultimately what a platform like YouTube
wants because they want to sell you ads. They want
to serve you up ads. That's their business model. They're
an advertising platform, the same way that broadcast television is.
They're advertising platforms. They just serve you content. You get
to watch All in the Family or Step by Step
or Animal Control or Dancing with the Stars or American
(01:14:50):
Idol or whatever garbage TV you're watching. You get to
do that in exchange for watching ten minutes of ads
every half hour. That's the trade-off. So TikTok wants to be an ad platform, to serve you content.
(01:15:11):
You know, it's an ad platform, and it keeps
you there by serving you content. YouTube the exact same thing.
If it's radicalizing you, you're engaged, and the more engaged
you are. The more you're paying attention, the longer you
use it, the more you're likely to share something with someone.
Oh did you see this crazy thing. The other thing
(01:15:31):
I want to I want to highlight here is we
say radicalization, and I do this. I'm very much a leftist.
We say radicalization, we always mean it in the context
of someone being turned into, you know, a swastika toting Nazi.
You know, someone who cannot wait to get behind the
wheel of a swasti car and swear their undying allegiance
(01:15:54):
to you know, the South African apartheid loving Elon Musk.
But we have to be honest about this. That's the
popular lexicon understanding of what radicalization is. But the truth
of the matter is those of us
(01:16:17):
on the left are also radicalized, because we so abhor that sort
of thing. We despise that kind of thing, and so
seeing it drives us to action, not with the platform,
but with donating money, with you know, trying to join
protest marches or build community or get connected with different resources.
(01:16:37):
You know, there's a website, everyone should go to it.
There's a website trying to organize a general strike in
the US in the future. They're trying to get three
million people, about a tenth of the US population, which
is about all it would take to get fair workers'
rights established in the country. We can do it, we
(01:17:00):
just have to be willing to do it. That is
radicalization as well. We're not swearing allegiance to a cult leader,
but we are radicalized against things that we do not
like in the same way that the radicalization of people
on the far right is happening, because, you know,
(01:17:20):
ultimately what's happening is they're being radicalized against things they don't like,
things they don't understand. We understand it, we still don't
like it, and so we're radicalized in the other direction.
But you know, the the algorithm that TikTok is predicated on.
It keeps you watching, and that's really effective for ad delivery,
(01:17:44):
for content engagement. If you want to have you know,
as mentioned something like Facebook or Meta trying to push
content to you from their official feeds, that's a good
way to get you to pay attention to it. Now,
for all the things that kind of come along with it,
I could see something in the algorithm basically saying well,
(01:18:04):
you're subscribed to Facebook, you're subscribed to meta and because
those platforms, much like shitter, much like YouTube, invariably become
vectors and platforms of hate speech, misogyny, sexism, racism, fascism,
(01:18:28):
all of those things come along with it. And that's
where you stop having room for the things happening in Gaza. Now,
the thing's happening in the West Bank, the stuff happening
with immigration and ice raids and all this stuff. You
don't see that because those aren't part of that echo
chamber of the right wing. Because ultimately you can't show progress.
(01:18:51):
It's the Rush Limbaugh problem. You know. In nineteen ninety four,
the Republicans took back control of Congress with their Contract
with America. Newt Gingrich was elected Speaker of the House,
and he promised to, you know, stop overspending and rein
in the Democrats and all of this. But Rush Limbaugh
(01:19:15):
continued promoting on his radio show that America was under attack,
that Conservatives were falling behind, that we had no power,
that there was no voice, that the Democrats were, you know,
causing all of this hardship, and hang up and la
da dah dah dah. And that created an audience that
was hungry for and wanting Fox News when it launched
in ninety six, and so you have this this echo
(01:19:40):
chamber that comes about because it keeps you engaged Facebook.
I think what you were seeing and what others have
seen since this I'm going to call it a relaunch,
is all of the accounts and content that would be
pushed or recommended by these bigger, monolithic platforms Facebook, Shitter,
(01:20:05):
Meta, whatever, those are coming across and drowning out the
things that you actually want to see as a leftist.
And this goes to something that I'm very passionate about.
If you're looking to jump from a TikTok or an
Instagram or a shitter or a Reddit, explore the Fediverse.
(01:20:28):
There is an answer for you, whatever it is that
you are interested in. If you want a TikTok alternative,
Loops literally just launched. The Android app launched yesterday. There
is an iOS test flight currently available that you can
sign up for on the website that's Loops dot Video.
(01:20:48):
I have the Android app and it runs flawlessly, and
I've seen some really cool stuff, recordings of the Aurora Borealis,
clips of, I forget what show it was, but it's clips
of Stephen Fry and another gentleman, and Stephen Fry is
laughing as he's talking to the other gentleman about him
being shagged by a rare parrot.
Speaker 1 (01:21:12):
Okay, I'll well, that's good content.
Speaker 2 (01:21:15):
That is, you can do it with Loops, like, it's there.
It's short form video. It's specifically intended to be an
answer to TikTok in the same way that if you
are an ardent lover and supporter of Instagram, Pixelfed
is what you want. It's photo sharing. You can share,
you know, an illicit photo of the ceiling of
(01:21:37):
the Sistine Chapel that you should not take. You can
post it on there. You can also take a picture
of your neighborhood, or the skyline, or just you driving somewhere.
You can do that with Pixelfed. If you want community,
and you know subreddits, there are communities on Lemmy that
(01:21:59):
you can subscribe to for everything from mechanical keyboards,
to cars, to painting, driving, cooking, you name it. There's
a community for you there well.
Speaker 1 (01:22:14):
And to clarify because the first time I heard you
say this, I thought you were talking about feta cheese.
But this is the Fediverse, F E D I V E R
S E, which has its own Wikipedia article, and there
is a software subcategory, so you can see, like, Pixelfed's
on there, Mastodon's on there. That's sort of a
Twitter alternative.
Speaker 2 (01:22:34):
They're more than sort of like what I had mentioned
it on the last episode that I was on. So
blue Sky is what everybody's going to. That's kind of
the big monolithic thing in the sky. And I think
the reason for that is it's easy for people to
understand it. Mastodon being part of the Fediverse. Now you've
(01:22:56):
got this whole thing of like, well what is the Fediverse?
How does stuff come together? And then you've got, you know,
the sales pitch line of like it's one account, one
account that you can use for Pixelfed and Loops
and Lemmy and Mastodon, and it's super cool
and those are great, but right now there isn't an
app that actually operates in all of those spaces where
you can have a single account. Yes you can, but
(01:23:20):
a single account and a single app that feeds you
all of that content in one place.
Speaker 1 (01:23:27):
You know what that sounds like? That would be That
sounds like that would be an everything app pretty much
now I really want the Fediverse to succeed because
I want them to beat the other guy to it.
Speaker 2 (01:23:39):
Right.
Speaker 1 (01:23:41):
Last question, did you try with all the TikTok controversy?
Did you check out red note the so called Chinese TikTok?
Speaker 2 (01:23:50):
I did not. I looked at a couple of write
ups and posts that some folks throughout Lemmy and Mastodon
had made. It looked like it was basically
what the Internet had been in Web one point zero,
so prior to these giants kind of coming together, it
was the old kind of AOL chat rooms, just people
(01:24:12):
coming together and talking about stuff and you know, just
kind of freely exchanging information, which you know, again it's
that cliche line. You know, there's more that unites us.
We're all just people. We're all just curious. I hope
some of us are curious.
Speaker 1 (01:24:28):
You know.
Speaker 2 (01:24:28):
It's the Walt Whitman line
delivered by Ted Lasso. Be curious, not judgmental. And I
think if we had more curiosity as people, as individuals,
as communities, we'd have a lot fewer walls in our
world and a lot less to be scared of.
Speaker 1 (01:24:49):
That seems to be the big thing that came out
of Red Note was that people started having back and
forth video conversations with you know, Chinese citizens and finding out,
of course, we're all humans and we have a lot
of things in common, our wants and needs and what
entertains us and all that, and also we both have
interesting ideas about the other country that, you know, maybe
(01:25:14):
maybe there are some good walls that could be broken
down there.
Speaker 2 (01:25:17):
Well, it goes back to the question I asked you
a little bit ago, why is China bad?
Speaker 1 (01:25:21):
Right?
Speaker 2 (01:25:22):
Don't It's not because of the people that live there.
It's the government and the policies and the decisions that
they make. The same way if you ask the question
in reverse, why is the US good? It's not inherent
inherently good just because we're the country of life, liberty,
and the pursuit of happiness or any other of these,
(01:25:43):
you know, things that we're conditioned to believe. We're good
because of the people. We're bad because of the policies
and the way we treat other people in the world.
China isn't inherently bad. They're good because of their people,
because you know, of how open they are and how
free we're able to exchange information. We're good for the
(01:26:06):
same reasons, we're bad for the same reasons. You know.
I think we've kind of highlighted that a couple of
different times here, and when it comes down to it,
we ultimately are just kind of curious because we don't
have, let me put it a different way, thirty years
ago, when the Eternal September happened, for those people who
(01:26:29):
were big on Usenet during that time in the early
early nineties, thirty years ago when AOL made its way
into every home in America and Earthlink was taking off
and everybody got a computer and was getting online. The
promise that we were sold from this technology was that
we were going to be connected in a way that
(01:26:51):
had never happened before. We were going to be on
the Information Superhighway. And again it goes back to
what is information unto itself. It's neither good nor bad.
It's just information. And the whole promise that we were
given was that this was going to be a way
to connect with not just our neighbors and our friends
(01:27:13):
and our distant relatives, but with people around the world.
We were going to be able to have conversations with
people in the UK and in India, and in the
Middle East, and we were going to be this global
community coming together. Yes, you know, it was the first
step toward reaching a level one civilization. And now, by
(01:27:36):
and large, because of corporate greed, because of capitalism, because
of monolithic providers and platforms, we have become a factionalized,
tribalized echo chamber loving people who find our little spot
on the Internet and refuse to leave it because we've
(01:27:56):
been conditioned to be afraid of what might be on
the outside of our little walled garden. I'm reminded of
a Mark Twain quote, and I'm going to butcher this,
but I will do my best to paraphrase it.
Speaker 1 (01:28:11):
There's a sucker born every minute. No that's not Mark Twain,
that was P. T.
Speaker 2 (01:28:14):
Barnum. There is nothing so detrimental to bigotry, closed-mindedness,
and racism as travel. Oh yeah, and I think that
still holds true. But you don't have to get on
a plane anymore. You don't have to travel ten cities
(01:28:37):
over or go to another state to see a different
way of life. We have computers in our pockets that
connect us to the entirety of the human experience, with
apps like Loops and Pixelfed, and you know the
old standard bearers, things like Instagram and its first generation, Flickr,
(01:28:59):
which is still kicking around. You know, Google Photos, Reddit,
Digg, you know, all of these platforms that people go
and post content to and share some part of themselves,
some small sliver of their slice of the universe, putting
it out there for the consumption and the enlightenment of
(01:29:21):
everybody else. What do I think the value proposition of
AI should be? I said it before. It should be
getting us an extra three or four hours every day.
We do the same amount of output and productivity, but
we get a half a day back to explore the
world around us, Go to a museum, have a conversation
(01:29:42):
with a stranger, engage you know, Oh you're on red note.
What's your favorite food? I don't know what that is?
How do you make that? What's your favorite food? What
do you do for fun? Do you guys have shopping malls?
Learn about the world.
Speaker 1 (01:29:57):
That should be what the American dream really is now.
Speaker 2 (01:30:00):
It should be in this day and age where we
have more access to more information than ever before. We
have the ability to connect and interface in ways that
one hundred years ago would have been unheard of, not
even dreamt of. We should be leveraging that to come
closer together and deliver on the promise that was made
(01:30:21):
to us and given to us thirty years ago, thirty
plus years ago, not to make the world smaller, but
to make it bigger and open it up for everybody.
You know, we have a litany of challenges facing us
in this new millennium. Climate change, war, poverty, death, disease, famine, hunger.
(01:30:45):
All of these things we can fix most of them
if we wanted to. We could just align the resources
and actively work towards solutions. But the only way that
the human race, I'm talking about the entirety of us
as a species, the only way that we will survive
most of the calamities facing us right now, is if
(01:31:09):
we work together. And this isn't my call for like
let's get kumbaya, but we have to be able
to work together. We have to be able to empathize
with one another and see different perspectives and step out
of the echo chamber and start working together if we
(01:31:30):
have any hope of survival.
Speaker 1 (01:31:32):
Look, I agree, inspiring words. Hopefully we're all taking that
to heart as these next months or years go on,
because we are going to need more of that first
within this particular country of America, though some of you
may be listening from outside America. I actually don't know,
and I should find out. But yes, we need more togetherness,
(01:31:58):
and I hope that we can utilize some of these
lines of communication appropriately to get to that end. Hey,
you know what, if you are listening from another country,
drop me a line suggested articles podcast at gmail dot com.
I haven't said that in a while. I don't think.
If there's one thing I'm really bad at, it's promotion. Well,
(01:32:19):
neon Chaos, thank you again for joining me today. That
was a lot of fun, a little more technical than
our usual episode. But we'll get the other guys back
here one of these days and then we'll just, you know,
we'll get drunk and talk about boobs or something. I
don't know.
Speaker 2 (01:32:38):
Boobs.
Speaker 1 (01:32:40):
Well, you can come talk about booze with us sometime
if you want. There's certainly more things to come. I
almost had a big announcement for this episode, but I
don't want to do that without other Jeff here, So
there's a lot to come on the horizon from us here.
Don't forget. Also, if if you haven't yet to get
a copy of Fane. Have you checked out Fane Neon Chaos?
(01:33:02):
The book by book daddy himself, Aaron Randolph.
Speaker 2 (01:33:07):
It is currently sitting on my e reader. I have
just not had an opportunity to get into it yet,
you know, because there's been a lot to read.
Speaker 1 (01:33:16):
That's true, what's going on in the world of your
own mental health. I told this to my mom recently
because she was having a little minor breakdown about all
the stuff coming out of the news. Sometimes you got
to unplug a little bit, give yourself a break, you know,
check out something different. Yes, Fane is a great sci
fi book, and Aaron Randolph will be on our show.
It could be next episode, But some of that's going
(01:33:39):
to depend on other factors, like why I'm alone here
today talking to Neon Chaos. So we'll see how that goes,
but very very very soon. So that is maybe your
final warning. Get reading, because it's a great book
and we would love to hear opinions on it, but
also we're going to be talking about it here on
the show.
Speaker 2 (01:33:57):
I will try to buzz through it as quickly as
I can. I did want to give you one really quickly,
and I have seen this a number of places, and
I smile every time I see it. And I will
preface this by saying I'm not a card carrying atheist,
but if there was such a thing, I would be one.
(01:34:18):
There is a bumper sticker that is going around, and
if you don't have one, get one. You'll love this.
People are kind of tongue in cheek, saying pray for Trump,
and specifically, underneath it, they're referencing Psalm one oh nine,
verse eight or Chapter one oh nine, verse eight. Okay,
(01:34:42):
and for those who haven't picked up a Bible in
a while, that line specifically says, let his days be few,
let another take his office.
Speaker 1 (01:34:51):
Yeah, I thought you were going there with that. That's funny.
Who knows. Anything's possible. Two weeks from now, it could
be a completely different administration. You never know.
Speaker 2 (01:35:05):
If my prediction is right, we'll all be waiting for
you know, nuclear.
Speaker 1 (01:35:08):
Fire, but oh yeah, or we could have all it
could be the fallout universe by the time we do
our next episode. Yeah, yeah, anything's possible. But until then,
I'm going to keep trying to do what I do,
which is a mixture of some current events and also
you know, trying to find some stuff to entertain me,
and that includes the book Fane and plenty of other things.
(01:35:32):
But in the end, a lot of it comes back
to what does the algorithm have to say about all this?
So I'm going to keep watching that too, And I
know you're not one to say it, but I'm going
to say it as we end out this episode, the
ultimate prayer for a non-atheist like myself, All Hail
(01:35:52):
the Algorithm.