Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
From UFOs to psychic powers and government conspiracies, history is
riddled with unexplained events. You can turn back now or
learn this stuff they don't want you to know. A
production of iHeartRadio.
Speaker 2 (00:27):
Hello, welcome back to the show. My name is Matt,
my name is Noel.
Speaker 3 (00:30):
They call me Ben. We're joined as always with our
super producer Dylan "the Tennessee Pal" Fagan. Most importantly, you
are here. That makes this the stuff they don't want
you to know. Thank you, as always, fellow conspiracy realists
for joining us. If you are hearing our strange news,
we'd also like to welcome you to November tenth, twenty
(00:55):
twenty five. We are recording this on Wednesday, November fifth,
so a lot of stuff
might change.
Speaker 2 (01:06):
Sure, And November fifth the day we're in right this moment,
it's the morning after a big election here in the US,
and uh yeah.
Speaker 4 (01:16):
Your, your mileage may vary, how you're feeling about it,
but some people are quite thrilled and some people are
doom saying.
Speaker 3 (01:24):
Yeah, way yeah. And congratulations to Virginia for the collective action.
Congratulations to everybody who took the time to vote. Now,
did you guys, are you guys people who vote early,
or do you like to go day of?
Speaker 2 (01:40):
I'm like, I'm like morning after. I like to cast one.
Speaker 3 (01:46):
Yeah, you like to show up to the library and say,
why isn't there a line?
Speaker 2 (01:50):
Yeah, what's going on?
Speaker 4 (01:51):
Oh no, it's the old spirit of the staircase conundrum.
Now I love it. Yeah, I went to my local
elementary school right in my neighborhood. I didn't even have to
get on the main road and went around four o'clock
and not a line to speak of. Very very smooth.
Speaker 2 (02:07):
It's something I've been noticing every election: on election day,
there don't seem to be many people out, because so
many people are making use of those early voting options,
which is.
Speaker 4 (02:15):
Not a popular measure for the Republicans, if I'm not mistaken.
They are very against mail in ballots and early voting,
arguing that it is a system that is rife with fraud,
though I don't know that I've seen any compelling evidence
of that.
Speaker 3 (02:31):
Yeah, it's it's cool to vote early. It's cool to
vote the date of I like it. I like walking
up in the line. Not to be too Larry David
about it. I know here in our fair state of
Georgia it's technically illegal to bring snacks and water,
(02:52):
but I dig it. Everybody in Atlanta at least is
vaguely related. We talked about that with our pal Michael
Render, uh, Killer Mike, as some of us might know
him. So we hope you had the opportunity to vote,
and we are using the word opportunity correctly. So, uh folks, Uh, look,
(03:14):
stuff is wild. We're hurtling headlong toward twenty twenty six.
We got a lot of strange news to get to.
Where should we start?
Speaker 2 (03:23):
We got robots.
Speaker 4 (03:26):
I love a robot. Beep boop, meep morp. Yep,
very much.
Speaker 2 (03:31):
So we have robots. Let's talk robots after a break.
And we've returned. Guess what, guys? What? Elon Musk. That's
it from.
Speaker 3 (03:49):
Earlier, from earlier.
Speaker 4 (03:51):
Yeah, yeah, yeah, yeah, I love he's got good vibes.
Speaker 2 (03:55):
Yeah. Well, okay, so that's, that's for shock value. We're
gonna talk about Elon Musk and Tesla and all the
various companies that are run by Elon Musk that are
all kind of interconnected, as we've seen in a bunch
of ways. I think very soon Tesla and the Optimus
robot are going to part ways, and it'll be like
the Optimus company, or, you know, it'll be something like that.
(04:18):
That's that'll happen. Just to get specific investment.
Speaker 3 (04:22):
Point of order, mister Frederick. We, the fellow conspiracy realists,
aren't going to let you go on this one. What's
an Optimus?
Speaker 4 (04:31):
Is it Prime?
Speaker 2 (04:32):
Oh, there will be, I guarantee there will be an
attempt to make an Optimus Prime-branded robot. But yes,
Optimus is the bipedal robot that the company Tesla is
currently attempting to create, something that is commercially viable, that
would exist in homes and in, you know, businesses, in
(04:54):
places to be a helper, a robot helper just like
a human, but a robot. And there are a lot
of companies out there that are attempting to make these.
We've talked about it a few times, I think specifically
on Strange News. Another big company is Figure. Another big
company is Agibot, whose website we cannot visit by the way,
if we're connected to our iHeartRadio system, because it's a
(05:17):
site in China that they specifically say is a threat.
Oh yeah, so we can't go. Sorry, Agibot, I can't
even learn about your tech because I'm here in the US.
Figure, though, has an incredible bot that you should all
check out, called the Figure 03, that we have mentioned before.
Very stylish looking robot with some cool gray mesh stuff
going on. Very almost human.
Speaker 3 (05:40):
Anthropomorphized for sure.
Speaker 2 (05:42):
Yes, and all of these companies are attempting to make
something that would actually be a viable product for humans
to buy. That then you could have you know, thousands,
if not millions of these things shipped out to homes
and to buyers. The big question is who in the
heck is going to buy one of these things that
(06:02):
is going to cost as much as a car, if
not more than a car.
Speaker 4 (06:06):
If you've seen any of these TikToks or, you know,
Reels or whatever making the rounds of people just basically
trying to get robots to malfunction absurdly, like run into
walls and like trying to get them to cook and
have them flinging food all over the place and throwing
pots and pans everywhere. And I don't know exactly the context.
(06:27):
On first glance, it makes it seem like the robots
are just not good at what they're supposed to be
good at, but upon further examination, it does feel like
some of these folks are like trying their best to
make them do awful stuff. Yeah, but yeah, anyway.
Speaker 2 (06:40):
Well, and how could you ever tell that it's not
just a generated video. I mean, just because it doesn't
have that Sora watermark on it doesn't mean it's not
a Sora video or something similar, you know. But at
the same time, we have seen these machines just malfunction
because they're not quite ready yet for these advanced levels
of actuation and things like that that you see in
(07:03):
production lines for vehicles, production lines all over the place
that actually have that built-in singular movement, right? Picks
up a piece of whatever it's working on, does whatever
it's supposed to do to it, then puts it back down. That kind
of movement that is a trained specific movement to be
done by a robot. When there's an assembly line where
there's going to be a known object in a known place,
(07:25):
in a known angle or something. Sure, that movement occurs
every time an.
Speaker 3 (07:30):
A-to-Z routine.
Speaker 2 (07:32):
Yes, So how do you get one of these bipedal
robots to do that thing when you've got so many variables?
When there's that can of Coca-Cola that you want the
robot to go pick up for you out of the
fridge and bring to you and open, or whatever.
Speaker 3 (07:48):
Without squeezing too hard.
Speaker 2 (07:50):
Oh yeah, there's so much stuff you got to do
just to make that happen. Right, you have to have
an awareness of everything and those small movements. How do
you get that exactly right? Well, Tesla and the Optimus
team are attempting to make this happen with humans, with
human test subjects.
Speaker 4 (08:09):
Well, it's funny when you say that, Matt. It
made me immediately think of that event where the Optimus
robots were sort of trotted out, but it turns out
they were being remotely controlled by humans from off site
or in another room, even like you know, elaborate expensive puppets.
Speaker 3 (08:23):
So the old question would be how do you do that?
Which is the question people posed with the Mechanical Turk
of old, if we recall.
Speaker 2 (08:32):
That, yes, absolutely, and you're right. It's all teleoperated, I
think they call it teleoperated controls or something, where someone
has actually got a control system and then they're the
ones doing that. There are a lot of allegations that
many of these, the higher-ups in these robotics companies,
will get one of their machines teleoperated when they've
got a huge investor, maybe, rolling through, which.
Speaker 4 (08:53):
Seems like fraud. I mean, that just seems like absolute fraud.
Speaker 2 (08:56):
Well, and I think there's probably an on-the-level discussion, like, hey,
this is what they will be able to do, and
you can see how they move and all this. We're
on our way there. We just need more funding kind.
Speaker 3 (09:07):
Of Yeah, this is our this is our demo of
the bomb, right and if you give us enough money,
we can make a bomb.
Speaker 2 (09:15):
Oh for sure, and you're gonna love our bombs, you're
gonna buy them.
Speaker 3 (09:18):
And yeah, and we have a subscription model, right? Follow
us on x dot com for sure.
Speaker 2 (09:26):
Absolutely. Okay. So how do you get these, or us,
to do it? You have human test subjects. This is
where we get to the Optimus lab. Business Insider and
Futurism and other places are calling it a secret Tesla
lab at the engineering headquarters for Tesla in Palo Alto,
California, and they're saying it's all glass and it
(09:48):
looks super futuristic. But inside this place you will find
something very similar to maybe what you remember seeing if
you watched the HBO series Westworld, these glass kind of
enclosures where stuff happens. You remember this on the
Speaker 3 (10:04):
The testing? Yeah, you remember this?
Speaker 2 (10:06):
Yeah, okay, so but inside here for real, in real life,
there are dozens and dozens of human beings in motion
capture suits, wearing backpacks that go up to
about forty pounds, depending on what the simulated battery pack
for the robot would be. And they do things like
stand in a rest position that the robot would be in,
(10:29):
which is specific and detailed, right? What an Optimus
looks like at rest. Then they move one arm up,
they lean down a little bit, pick up a cup
on the table, pick it up like a robot, hold
it like a robot, and then put it back down.
And guys, they have to do these tasks for hours
and hours and hours a day until they get a break.
They have to get on a daily basis, these human
(10:49):
test subjects, they're called data collectors.
Speaker 3 (10:52):
They have to get.
Speaker 2 (10:54):
Yeah, they have to get four hours of footage per
day of themselves as a robot doing things. And we're
talking about vacuuming. What does vacuuming look like? And holding
specific sizes of vacuums.
Speaker 3 (11:06):
It reminds me a lot of mocap, mo-cap.
Speaker 2 (11:08):
I was thinking the same thing. It is amazing, the
whole thing is motion capture. But I'm just thinking about
this weird world we're in now, where we've got human
data annotators that are taking all of the stuff on
the web that these large language models are being trained
on and putting little notes about what this is, what
this is about, what is being said in this website
(11:31):
about this subject. We've got humans in lab scenarios doing
the things that robots will one day do to train
the robots. It feels so strange and wrong to me.
I just wanted to see if you guys felt that
same way, knowing that there are human beings training these
robots that will eventually be walking around and sold to
(11:51):
people that have houses like the ones on figure dot
AI's website. It's like, the most they've got this robot
walking around the most palatial, marble floored places, you know,
five million dollar homes on the exterior, walking something in
on the gravel driveway. I'm just trying to imagine what
the hell are these things going to actually be doing,
and who's going to be owning them? And why is
(12:13):
it such a big deal, and how could such a
huge amount of the global economy be pushed into this stuff,
as if this is the thing, right? But somehow the
human toil still isn't enough to get a Tesla robot,
one of these Optimus guys, to do some pretty basic
things like go get a Coke out of the refrigerator
(12:35):
for Marc Benioff.
Speaker 3 (12:39):
And, yes, here's the thing. I love, Matt, Noel,
I love the comparison of a large language model sort
of plagiarizing earlier human thought. And now the idea is
that we are creating a thing that is taking an
(13:01):
LLM approach to physical action, right? So scraping
the idea of taking a Coke from a refrigerator, or
scraping the idea of, you know, scraping a floor, or
brooming, sweeping. Sweeping is the word; I'm still learning English.
Speaker 4 (13:24):
Same, you know. That's really interesting, because isn't the
crux of, like, why we don't yet have
Rosie the Robot-level, you know, autonomous assistants, the nature
of mapping to a space and avoiding all of the
random things that could pop up as impediments to a
(13:44):
simple task, and that's where the adaptation part is so important,
not just learning the moves right. It's bigger than that.
Speaker 2 (13:51):
Well yeah, and yes, absolutely, mapping environment is massive. And
also, as you're saying, the variables. You think about
the Waymos and automatic driving cars and all that stuff,
and attempting to avoid something like a cat that runs
across the street. I don't know if you guys saw
the, there's a national news story about a cat that
got hit by a Waymo and how big of a
(14:12):
deal it was.
Speaker 3 (14:13):
I'm read up on cat news for sure.
Speaker 5 (14:15):
We go.
Speaker 4 (14:16):
Yeah, we follow that hashtag. Also, like, you see those
Waymos get spun up into a tizzy because of something
really simple, and all of a sudden they're like, does
not compute.
Speaker 2 (14:26):
You know, it's a machine that's attempting to make the
next best decision and that is not an easy thing.
That's a reason why our brains are so awesome because
we can do that pretty quickly, and we can kind
of make multiple decisions at once and think about the
future decision, what that's going to mean, and the past
decisions that we made, and how this one's going to
be different than the one we made.
Speaker 3 (14:45):
Before, and not consciously think about that. Your background software, humans,
it's doing a lot of work for you.
Speaker 2 (14:54):
Yes, but the way our stuff is actuated, right, all
these limbs we've got and our sense machinery that we've
got going, it's just been here. There's been millions
of years, if not hundreds and hundreds of thousands of
years, of evolution that goes into making this stuff work.
Speaker 3 (15:11):
Millions. Proprioception was a hard-earned sensory trait.
Speaker 4 (15:15):
It also reminds me of, like, the whole "I know
kung fu" thing in The Matrix, where all of a
sudden they just dump all of the wealth of human
knowledge into, you know, Neo's brain, and now all of
a sudden he can do all of the martial arts.
But that's like mega sci fi, and that's you know,
trying to do it in a human with some sort
of brain computer interface, also relying on the malleability of
(15:36):
that brain. And what they're trying to do is teach
robots how to be human really quickly, you know, rapidly
with this like information dump, and I just think it's
much more nuanced than that.
Speaker 2 (15:49):
Yeah, No, I agree, I agree. It puts a pit
in my stomach when I when I read about it
and I watch this stuff, because you can just see
it coming. You've got You've got Elon Musk saying things like, oh,
we're gonna have five thousand of these Optimus robots ready
by the end of the year. And this is him
talking to potential investors, right, and then he says, well,
hopefully we're going to be creating, you know, to the
(16:10):
scale of a million robots per year eventually. And it's
a multi trillion dollar market according to the people who
have money who invest money in things. But nobody can
fully grasp who's actually going to buy these and how
are you going to get that many in the marketplace
to make this a multi trillion dollar market.
Speaker 3 (16:28):
And after a certain economic threshold, what consumer base will
exist to afford them?
Speaker 2 (16:35):
Because all the jobs are going away, not all of them,
a lot of the jobs are going away, and the
people with money are deciding that that's the good thing
to do. And the people who use AI in their
corner offices, you know, in the top of Manhattan every
day say AI is the greatest. Oh my god, my
AI chatbot can do incredible things. Oh wow, this is
(16:56):
the future. Everyone should be doing this. Why do I
need this staff? They can't, they can't collate all my
notes the way this chatbot can.
Speaker 3 (17:04):
Also, I will statistically be dead by eighty four human
years no matter what the hell I do. So they're
not incentivized to think in the long term.
Speaker 2 (17:15):
Oh for sure. So let's watch this quick clip that
was posted on X by mister Marc Benioff.
Speaker 5 (17:23):
Uh.
Speaker 2 (17:25):
Marc Benioff, guy with money. No, I'm just, sorry. You
could look him up. He's a very interesting person, Salesforce CEO.
He was hanging out, I guess, with Elon Musk, just
right next to him, and he was getting a little
demonstration of the Optimus bot not long ago. So let's
watch this clip together. You can listen to it at home.
You can also find it on X, slash Benioff. The Game of
(17:48):
Thrones guy.
Speaker 4 (17:49):
He's related to David Benioff of, oh maybe, oh, Game
of Thrones fame. They're, like, I did not know, second cousins.
Speaker 2 (17:55):
That is very cool and congratulations to both of you guys.
Speaker 4 (18:02):
I had to check.
Speaker 2 (18:03):
Uh, let's watch this really quickly, just so we can
have an understanding of, this is currently what an Optimus
bot looks like when it's being shown off to a
potential investor.
Speaker 5 (18:14):
Hey, Optimus, what are you doing there?
Speaker 4 (18:18):
Oh boy, just chilling, ready to help.
Speaker 5 (18:20):
Hey, Optimus, do you know where I can get a Coke?
Speaker 4 (18:24):
Sorry? I don't.
Speaker 3 (18:28):
I don't have real-time info, but I can take... Oh
my god, this guy sounds like me.
Speaker 5 (18:33):
Great, Yes, let's do that. Let's get a.
Speaker 6 (18:41):
The lag is intolerable. Awesome. Let's head into the kitchen and, okay, okay, go.
I think it's, I think we need to give it
a bit more.
Speaker 2 (18:52):
Okay, it's slowly turning and it's starting.
Speaker 3 (18:56):
It's doing a little walk.
Speaker 2 (18:58):
Listen your pants.
Speaker 3 (19:00):
Yeah, clank clank clank.
Speaker 2 (19:03):
Isn't that crazy?
Speaker 3 (19:05):
It's not very impressive, right? I mean, are humans impressive?
I, I think we should not, Dylan beat me here,
I think we should not immediately dunk on prototype ideas.
The precedent is there, Matt.
Speaker 2 (19:24):
What do you mean?
Speaker 4 (19:25):
Well, I just meant it's it's been a minute. I
just feel like they should be a little bit further
along than that. Just I don't know. It just seemed
really unimpressive, very lackluster, Like I don't know.
Speaker 3 (19:34):
Sure, Yeah, the Wright brothers were unimpressive, right, I disagree.
Speaker 4 (19:41):
Their plane did a thing. This, this is, this guy
just, I don't know. I get that it's hard to
make these things. I get that the technology is tough,
but it's just like, that's what you're trotting out? Like,
I mean, the lag alone is.
Speaker 3 (19:52):
Just like, really? I'm sorry, I just, trot's an interesting
word too, because it's definitely trotting, like it pooped its
pants. Well.
Speaker 2 (20:01):
It's just strange to me, guys, because I seem to
recall this company called Boston Dynamics that I watched, well, yes,
I think we all watched in awe as Boston Dynamics
every year, every quarter, would roll out more and more
impressively dexterous robots of, you know, varying forms. And
(20:24):
we've been watching that for well over a decade now,
like well over a decade. So it just makes me
wonder what the difference is, right, And I think one
of the primary things is early on in the Boston
Dynamics videos, you had bots that were they were kind
of chained up, right, they had a power source, they
were helped a little bit. Yeah, But then eventually you
(20:47):
move on to those, I guess you could call them
the dog bots, the four-legged bots that were
crazy dexterous, and then not long ago at all, you've
got the ones I can't remember the name.
Speaker 4 (20:58):
Of, the backpacky-looking ones. Yeah, that would, like, yeah, do,
like, flips and stuff.
Speaker 2 (21:02):
It can do flips, it can analyze, you know, its
terrain and make kind of jumps and make those decisions.
It just makes me wonder. I wonder if it's a
price-point thing and they're just trying
to figure out how do you make something that is
going to be affordable by somebody in a mansion.
Speaker 4 (21:20):
But like, also, I know where the kitchen is, you
know what I mean? Like, if you can't get me
the Coke, then what use are you? It's just an
odd demonstration for that, too. Who's that going to impress
is my question. And you're right, Matt, it is all
of that history of remembering those videos of those other
prototypes and just being really, yeah, I'm not trying.
Speaker 3 (21:41):
It's not a bad question, though, I think it's an
astute question. We can say it this way. Look, the primary
issue here is a technology arms race, right? So now
the pitch for Optimus appears to be we can make
some kind of whole home-grown robot that is doing
(22:02):
things that other countries can already do. So that's our lag, right.
I think Noel used the word lag earlier; that was
not a dig. If you look at the parable of
what's happening here, you can also see earlier historical precedents,
like the taming of the horse. Right? There's a reason
(22:26):
Genghis Khan rolled through with a cavalry and it took
Western Europe a ton of time to figure it out.
So maybe this is a marketing grift. And I hate
to say it, you guys, but our buddy Elon doesn't
have the best track record for discovering stuff. He has
(22:46):
a track record for appropriating stuff.
Speaker 4 (22:49):
Well, he also has a track record of way overpromising
and way underdelivering.
Speaker 2 (22:54):
Yeah, and then just throwing a bunch of humans at
the hard problem, right? And that's currently what's happening in
the space program and in Tesla Optimus. But he was in a
boardroom very recently, what is this? This was on
The Verge from October twenty-third. There was an earnings
call where Tesla CEO Elon Musk basically said, I need
(23:18):
a trillion-dollar pay package to make this robot
army. You can go through and get the full quote;
here's one: My fundamental concern with regard to how much
voting control I have in Tesla, is if I go
ahead and build this enormous robot army, can I just
be ousted at some point in the future. That's my
biggest concern. The idea that the creator of the robot
(23:40):
army would no longer have full control of the robot
army he has built. It's kind of freaky.
Speaker 4 (23:47):
Yeah, well, as if that's supposed to be how it
works anyway, right, Like you know what I mean? Like,
is he meant to be like their overlord? Like? Is
that what he's saying? I need control? Control? In what respect?
Are you not designing and building a product that can
then be used by other organizations and entities This idea
of control, I'm a little confused about what he means.
(24:08):
It almost makes it sound like he wants to be
the one wielding the robot army.
Speaker 2 (24:12):
One could gather that, I would say, without a lot
of elucidation on what he meant. I'll run down the
headlines on this, guys. Elon Musk says he needs one
trillion to control Tesla's robot army. That's The Verge; that's
one of the ones you can look up. Also, Secret
Elon Musk lab collecting data on every human activity to
train robots. That's the Futurism article. And then the Business Insider one that that
(24:34):
one's based on is inside the glass-walled Tesla lab,
where workers train the Optimus robot to act like a human.
That's from Business Insider. Those are the main stories. We'll
be right back after a word from our sponsor.
Speaker 4 (24:51):
And we've returned. I got two today, I think, both
of which might be worth a larger discussion, so I'm
going to keep each of them a little bit bite-size.
But we're going to start with a topic that I
don't think we discussed in the first place: the recent
daylight heist at the famous Louvre Museum in Paris, France,
on October nineteenth, where thieves managed to steal royal jewels
(25:16):
in the middle of the day, like during visiting hours.
I believe there was an evacuation that was called. I
don't have too much information on exactly how that went down.
That's sort of outside the scope of this story, but
it did happen, and what also happened was it revealed
or brought to light some past investigations from twenty fourteen
about the organization's, or the museum's, incredibly outdated and dangerously
vulnerable IT security systems. Reading here from, let's see, Business
Standard, in a piece by Rimjhim Singh. This is
Standard in a piece by Rim Jim Singh, this is
(26:00):
out of New Delhi, and he says here The Louver,
one of the world's most visited and famous museums, faced
a shocking theft on October nineteen. Thieves managed to steal
priceless royal jewels in broad daylight, raising serious security questions
about the museum. Following the heist, investigations revealed that the
Louver had been struggling with serious cybersecurity and maintenance issues
for over a decade, many of which had been repeatedly
(26:20):
flagged but never fixed, according to French daily paper Libération.
Those aforementioned vulnerabilities that came to light in a previous
investigation were from December of twenty fourteen, when the National
Agency for the Security of Information Systems, a French agency,
audited the museum's security systems, tested the stability and vulnerability
(26:44):
of their network and other critical systems that included video
surveillance systems, alarms, and access controls. In a twenty-six
page confidential report that was released, they found that typing
the word Louvre could get someone into their entire video surveillance system,
while typing in the word Thales granted access to a
(27:07):
proprietary software system developed by the Thales Group. Yeah, so
they warned the Louvre that the applications and systems
deployed on the security network present numerous vulnerabilities, also referring
to outdated operating systems that needed upgrading. Literally using Windows
two thousand. You guys might recall that relatively short-lived
(27:30):
OS out of Microsoft, a system, by the way, that
has not been supported in terms of updates, et cetera,
for nigh on over a decade. By twenty seventeen, the article
goes on, another audit by the National Institute for Advanced
Studies in Security and Justice found many of the same
issues persisted. So this is three years after that initial
(27:52):
twenty-six page review. The report warned that, quote,
serious deficiencies were observed in the overall system, and that
the museum could no longer ignore the potential risk of a breach.
This forty-page report also highlighted issues including accessible rooftop
panels, you know, the kind you might see the Pink
Panther lowering himself down from, or like a Mission Impossible situation,
(28:14):
outdated video surveillance, rooftops that were accessible during renovation work.
It also mentioned computers that run obsolete operating systems like Windows
two thousand and XP, running without antivirus software or any
kind of, you know, standardized password-protection policy. As part
(28:36):
of a corporation, as we all are. It's annoying when it happens,
but it makes a lot of sense that we get
notifications for changing our passwords periodically, and it has to
be relatively different, different enough from the previous one. Some organizations
also require strong passwords that are autogenerated that, like, have,
like, tons of crazy keys. But here it's looking like
the Louvre, or the word Louvre, would have gotten you
(28:56):
into some key systems.
Speaker 2 (28:58):
So that's really that.
Speaker 4 (28:59):
I just, I think, yeah, just really quickly, a little
more details about the heist. It happened on October nineteenth. Burglars
robbed the place in broad daylight at nine thirty a.m.,
after opening. They used a truck-mounted ladder to get
to a second-floor balcony, broke a window using grinders,
like you would, you know, like the Pink Panther, triggered
(29:20):
alarms, and entered the Apollo Gallery, which is where the
crown jewels, French historical artifacts, are housed. Visitors were quickly
evacuated as police arrived, but the robbers had already escaped.
This included nine pieces of royal jewelry from the Napoleonic era,
as well as items from Queen Marie-Amélie, Queen Hortense,
(29:42):
and Empress Marie Louise. The tiara, in fact, of Empress
Eugénie, encrusted with two thousand diamonds and two hundred pearls,
was stolen as well. Priceless, priceless stuff. Emmanuel Macron, the
President of France, referred to the heist as an attack
on our heritage that we cherish because it is our history.
So this is really a big problem for the Louvre,
(30:03):
and it seems like someone's gonna have to answer for this.
Speaker 2 (30:07):
Well, there are four suspects, I think, as of today,
November fifth. Let's hear it. Well, there
are four suspects in custody, but the whereabouts of
the royal jewels and a lot of the other stuff
that was taken are currently unknown. It seems like those folks
aren't talking. They appear to be, well, at least according
to several sources. I'm looking at ABC News right now,
(30:30):
and before that I was looking at The New York Times,
which has something on the power of DNA databases in solving crimes.
CNN calls them local petty criminals in Paris. It seems
like they've at least got a few people, and if
they actually talk, then you might get answers. Otherwise, it
seems like maybe some of that stuff already got fenced.
Speaker 4 (30:52):
Oh, I wouldn't be surprised if they're not able to
recover it, certainly. I think they're going to
get some convictions, or at the very least find someone to
take the fall for PR purposes. But I would be
surprised if they got all of the pieces back.
Speaker 2 (31:03):
But we'll see. Very weird stuff.
Speaker 4 (31:06):
Moving on to one, then, that I think you'll
be interested in. It may well be something that's very
much on your radar, pun entirely intended: Ukraine's game-style
drone system changes how soldiers track and plan strikes, coming from.
Speaker 3 (31:21):
Could you tell us more about this? Like, are
they getting points?
Speaker 4 (31:25):
Yes. It's crazy, man. Amazon for war is what
it's being referred to as. It's a platform called Brave1
that allows soldiers to earn points for carrying out successful
drone attacks and exchange them for weaponry. And I swear
I had to do a double, triple, quadruple take on this.
(31:48):
It's like a marketplace of, like, drones and artillery, and
it's a point system that's basically, like, almost like, you know,
Dave and Buster's points, where you can go into the
little thing and exchange it for, like, you know, a
million points might get you a PlayStation or something like that.
Speaker 3 (32:05):
Oh, thank you for the D and B reference.
Speaker 4 (32:07):
Well, we love D and B, and the points system I think
is appropriate there, although it seems like this one might
be a little more weighted in the favor of the
point earner, because it's an incentive that seems to be working.
Writing for Interesting Engineering, Sujita Sinha says a video-game-
style drone attack system has gone viral among Ukrainian military units,
(32:28):
turning real-life warfare into a digital competition. The First
Deputy Prime Minister of Ukraine, Mykhailo Fedorov, spoke to the
Guardian newspaper, saying that drone teams killed or wounded eighteen
thousand Russian soldiers in September. Around four hundred drone units
now take part in the system, up from only
(32:49):
ninety-five in August. He says it's truly become incredibly popular.
The system works through this online platform called Brave1,
which is often described as Amazon for war, and which
offers one hundred different kinds of drones and other autonomous systems.
There is a leaderboard for this, y'all. Truly, like,
you know, in some kind of, you know, Call
(33:10):
of Duty situation. The bonus system was launched over a
year ago, the article goes on, to motivate drone teams
and enhance efficiency on the battlefield. It also apparently generates
some very interesting and allegedly useful information in terms of
mapping out the way these attacks are going down. It
(33:32):
adds this level of transparency, and we'll get to that
in a second. The program now extends beyond combat drones.
Artillery and reconnaissance teams also earn points for confirmed strikes
or for spotting enemy targets. Even logistics units can now
score points using autonomous vehicles. It's described in the piece
here as Uber targeting, not like awesome targeting, but Uber
(33:56):
like the ride-hailing app. The idea is that you drop
a pin on a map, like you would for calling yourself
an Uber, and then the next available unit picks up
on that pin and carries out the attack. It says
here points are awarded for specific types of missions. Killing
an enemy drone operator, as an example, earns twenty five points,
(34:19):
while capturing a Russian soldier earns one hundred and twenty points.
And this has been approved directly by the Ukrainian cabinet.
And apparently, I think I knew to
some degree the level of technology being employed in this war,
but their systems on the Ukrainian side have gotten increasingly
(34:39):
more technical. They've been at war for nearly four years now,
and they are finding ways to, well, they're looking for efficiencies.
Fedorov said, we're thinking of this as just part of
our everyday job.
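The scoring-and-redemption mechanic described above can be sketched in a few lines. The two point values come straight from the article; the mission names, the catalog items, and their prices are made-up placeholders, not anything from the real Brave1 platform:

```python
# Minimal sketch of a Brave1-style points system. Point values for the
# two mission types are from the article; everything else is invented.

MISSION_POINTS = {
    "drone_operator_killed": 25,   # per the article
    "soldier_captured": 120,       # per the article
}

CATALOG = {  # hypothetical "Amazon for war" storefront, prices in points
    "fpv_drone": 40,
    "recon_drone": 100,
}

class Unit:
    def __init__(self, name):
        self.name = name
        self.points = 0

    def log_strike(self, mission, video_verified):
        # Per the article, strikes only count once verified by video.
        if video_verified and mission in MISSION_POINTS:
            self.points += MISSION_POINTS[mission]

    def redeem(self, item):
        # Spend accumulated points on gear from the marketplace.
        cost = CATALOG[item]
        if self.points >= cost:
            self.points -= cost
            return item
        return None

unit = Unit("Team A")
unit.log_strike("drone_operator_killed", video_verified=True)
unit.log_strike("soldier_captured", video_verified=True)
print(unit.points)                  # 145
print(unit.redeem("recon_drone"))   # recon_drone
```

The `video_verified` flag mirrors the article's note that strikes only count toward the leaderboard once confirmed on video.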
Speaker 2 (34:52):
So you get points by killing enemy infantry, and you
use those points to get more drones.
Speaker 4 (35:00):
It's like a video game system where you can upgrade
your car in a racer or something like that.
Speaker 3 (35:05):
You know, you also get points by doing a Scottie
Pippen move and alley-ooping other folks, by identifying targets.
Speaker 4 (35:13):
Correct, the recon aspect of it, for sure.
Speaker 3 (35:15):
Yeah, that's correct point of order.
Speaker 2 (35:17):
How does one capture an enemy infantry person with a drone?
You, like, have a drone that potentially has explosives, and you
sit it in front of them, and a camera,
and location?
Speaker 3 (35:29):
We ultimately have to send the people out.
Speaker 4 (35:32):
Yeah, there's got to be some on-the-ground aspect
of that. I don't fully understand what the drone part
of it is. I think it's just, I don't know.
It seems to me like the drones are what's capturing
it as well, like in terms of verification, you know,
there's like video evidence that this has happened, and then
that triggers earning the points or
Speaker 2 (35:51):
What have you.
Speaker 3 (35:52):
Right, No, I'd say it's similar to the historical precedent
here would be the relationship between the corvids and the canids,
specifically ravens and wolves. So the ravens are kind of
the first drones, right for the wolves. So in this comparison,
you send out the drone, which identifies the target, which
(36:14):
helps the wolves know where to go. I
don't think that's new. That is a relatively workable and
trivial sort of symbiosis. But the big issue that I
think you're raising here is the idea of gamifying war
and gamifying human lives. Right, we're paying for that leader
(36:38):
board in blood, you know.
Speaker 4 (36:40):
It's interesting. Then to that point, my girlfriend works with
kids teaching math and was really surprised when speaking to
a young person and asking what they want to do,
they said they want to go into engineering. What kind
of engineering you might ask, weaponry? Like making that jump
immediately because we were talking and it's sort of like typically
(37:02):
you think of maybe you get an engineering degree and
then you specialize or you get placed in a company
that does that kind of stuff. But to have these
kids already thinking weaponry, I want to do that. I
think a lot of it has to do with the
appeal of the gamification of it all. And maybe that's
a little boomer of me, you know, it's all about
violent video games or whatever, but I think there's something
(37:23):
to it perhaps, And this is a great line here
from the piece, and then we'll move on. That explains
what we were talking about. Fedorov goes on: thanks
to the points, we're actually starting to understand more about
what's happening in the battlefield. All the strikes have to
be verified by video, and the government then takes that data,
analyzes which scenarios played out using which combinations of weapons
(37:46):
and tactics and recon and all of that, and then
they're able to use that data to further fine-tune
things for future, you know, conflicts or future
skirmishes. And apparently it's doing gangbusters for the
morale of troops. Experts from the Royal United Services Institute
(38:10):
are urging NATO countries to not do this. It
definitely has its detractors.
Speaker 2 (38:17):
It's pretty weird because I know the Pole Headmaster called
in and specifically, guys, gave me a bunch of crap
for playing Call of Duty Mobile, and about how it's
a trash Call of Duty game and only kids play
it and that kind of thing. But I know firsthand
the pull of earning those points, right. And in that game,
you earn points by killing other players, right, That's how
(38:40):
you get those points, And when you get enough points,
you get new stuff, And I know exactly what that
feels like. I cannot imagine what that would be like
in this real-world scenario, but I do know that
pull is pretty universal.
Speaker 4 (38:52):
Well, it's that dopamine hit and that, like, yeah, gamification
is really useful. I love gamifying my fitness stuff for example,
or just like routine things or you know, trying to optimize,
find efficiencies, you know, get your patterns locked in so
that you're getting the most bang for your buck, like
life wise. I get the I get the pull of
that too. And also been playing a hell of a
(39:13):
lot of Sonic Racing CrossWorlds, where you earn these
tickets every time you complete a race that you can
then save up and buy cool stuff for your racing
pod or whatever.
Speaker 5 (39:22):
Mm hm.
Speaker 4 (39:23):
So yeah, there you go. I think those are two
things we can maybe talk a little bit more about in
the future, keep an eye on. Let's take a quick break here
to hear a word from our sponsor, and then we'll come back with
the last segment in today's Strange News episode.
Speaker 3 (39:38):
And we have returned with the last act of our
weekly Strange News segment. Who went to the grocery store?
Who went to the grocery store? That's, you know, anytime. Yeah,
I think I went. Always, yeah. Historically I like a
grocery shop, I do. So you guys order your stuff
in person?
Speaker 4 (39:57):
I get delivery occasionally, but I actually like a trip
to the grocery store.
Speaker 2 (40:00):
I like you too.
Speaker 4 (40:01):
I like doing small shops a couple times a week
rather than like massive ones.
Speaker 3 (40:04):
It's one of those rare third spaces that we talk
about in sociology. So for our last act of this
weekly Strange News segment, we wanted to introduce you to
something that might get us in trouble and indeed is
a setup for an episode in the future. Gosh, let
me step back, guys. Remember when we talked about dynamic pricing? Yeah,
(40:28):
not too long ago. Can we define dynamic pricing just like.
Speaker 4 (40:32):
You know on the fly price fluctuations in real time
of things like fast food? Specifically, in this case, things
like groceries. The idea of even some places wanting to
change the price tag spots on the shelves to a
digital readout that can fluctuate in real time based on
(40:53):
the market, based on supply and demand. All of that
stuff never seemed like a great thing for consumers.
Speaker 2 (40:58):
And the last time we talked about I think we
even talked about personalized pricing in a store when you
walk through.
Speaker 3 (41:05):
So if we want to play a game, I would
I would recommend if you're not driving, folks, pull up
your ride share app of choice, right, if you have
an Uber or Lyft? Do you guys use Uber? Lyft?
Matt? Noel? I typically use Lyft because it earns Delta SkyMiles.
Oh nice, that's the game. Noel's putting you on game, y'all.
Speaker 4 (41:28):
SkyMiles points, man. Points.
Speaker 3 (41:32):
So here's an interesting thing you can do to see
the dynamic pricing conspiracy at work. We're going to show
you how the rabbit comes out of the hat. Folks,
go ahead and pull up these apps if you have them.
If you have one, great, if you have both. More interesting.
If you pull up an Uber ride for let's say
(41:57):
the iHeart studio to the Atlanta airport, Hartsfield. If
you pull it up on Uber and you pull it
up on Lyft, you are going to get a different price.
If you pull it up on the way, you are
are going to get a different price. If you pull
it up next to If, for instance, Noel, Matt Tennessee
and I were hanging out with you and we all
(42:21):
pulled it up at the same time, we might three
out of the five of us get different prices. Some
would be higher, some would be lower.
Speaker 4 (42:30):
Depending where you are. I have a bit of a
life hack that probably everybody knows about. But if you
click Wait and Save, it almost always comes just as
fast as the regular one, and it is significantly cheaper
right up front. I've never clicked Wait and Save,
short of it being mega, mega traffic time, where it
hasn't come right away. It's almost like a trick. They
don't want you to click that one. But if you
(42:51):
click it, you'll save some money and they'll come just
as fast.
Speaker 3 (42:54):
That's some stuff they don't want you to know. So
how does this translate to what Kroger's up to?
Every time you step into a grocery store, and thank
you for the setup there, Noel, I appreciate that, alley-oop,
you are entering into a vast collection mechanism, as
we're going to see. You know, we talked briefly about
(43:15):
the increasingly monopolistic drives of grocery stores. They're making a
ton of money off you, fellow shoppers, because of the
things like the loyalty cards. But it goes way deeper
than that. There is a vast array of mechanisms that
you will not clock. They are meant to track, analyze, share,
(43:40):
and further influence your shopping behavior. I'd like to shout
out our longtime conspiracy realist Matthias for hipping me to this,
who was in talks with a lawyer who is currently
deep in the trenches of figuring out whether or not
this is legal. If you want to learn more, go
to epic dot org, February fourteenth, twenty twenty five. Happy
(44:05):
former Valentine's Day to everybody, belated. Uh, the headline is
Kroger's Surveillance Pricing Harms Consumers and Raises Prices With or
Without Facial Recognition. So shout out to Mayu Tobin-Miyaji,
who originally wrote this. Grocery stores know so much about you.
(44:27):
Did either of you guys ever work at a grocery store?
Speaker 4 (44:30):
Never did, no. No, I know you did,
for sure, like bagging and doing the carts and
all that stuff.
Speaker 3 (44:36):
Okay, yeah, so we I mean we all worked in
various things before our podcasting lives together. I think we
got some Mellow Mushroom in the crowd. I think we
got some Piedmont Driving Club. We got a vegan place
out in Knoxville, Dylan, what was the vegan restaurant there
in Knoxville?
Speaker 2 (44:55):
Vegi Rama, great name.
Speaker 4 (44:57):
Wow, they really locked that one down.
Speaker 2 (45:00):
Did you ever work at a like a Kroger or something, Ben.
Speaker 3 (45:04):
Yes, I did. Yeah, Matt, thank you for asking. I did,
and they would not fire me. I eventually just had
to leave.
Speaker 4 (45:12):
You were just too good.
Speaker 3 (45:14):
I kept trying to I kept turning in my notice.
It was very Seinfeld. I kept turning in my notice,
and they kept scheduling me. And then one day I
just said, you know, I can't do this anymore. I'm sixteen,
I have a Monte Carlo. Stuff's looking up for me.
And they were like, all right, we'll see you Thursday.
And I simply did not see them Thursday. This was
(45:36):
a time before grocery stores were collecting all this
information. It started with the loyalty card. Right now, your
local grocery store, especially if it's a big chain, may
know your age, your identified gender, your race, your economic
status, aka how much will this guy spend on some beans.
(46:00):
They'll know your family makeup, right. Does this cardholder have children?
Does this cardholder have an elderly person? Does this cardholder
perhaps use SNAP benefits? Yes. Oh, I love that point, Noel.
And then the other question is what are their health conditions,
what are their other lifestyle characteristics, vaguely put. This, combined
(46:25):
with the other hidden income streams of what we call
data brokers, means that grocery stores may well be as
conspiratorial as it sounds. They may well be the next
step in an Orwellian nineteen eighty four. So let's say
(46:46):
we're walking up. Let's make up characters, everybody. Let's make
up a character very different from ourselves. I'll be Yolanda,
recently arrived from Ukraine. I'm very into yoga.
I have four kids, and I'm looking after my elderly grandmother.
(47:06):
So, Yolanda. All right, to put some folks in the spotlight: Noel,
what's a character? You get a character. Bluey? Bluey. All right,
Tell us about Blue.
Speaker 4 (47:17):
He's the you know, the cute Australian dog, the cartoon
dog the kids love.
Speaker 3 (47:21):
All right, he's a cartoon Australian dog, the kids love him.
What does Bluey buy at Kroger? Kibbles? He buys kibbles. All right,
all right. So we got Yolanda, we got Bluey. Matt,
Let's get a character.
Speaker 2 (47:36):
Pete Raisinson. Pete works there at the local
Kroger in Temecula. And Pete is, is he produce? Pete doesn't
make a lot of money. Pete is just, you know,
works at the counter, but he makes enough money to
stay there and he can support, you know, just his
(47:58):
lifestyle as a twenty two year old young man.
Speaker 4 (48:01):
M M.
Speaker 3 (48:02):
What department does Pete work in.
Speaker 2 (48:04):
He's in the checkout counter. He's one of the last
remaining checkout people.
Speaker 3 (48:07):
He's a human checkout counter guy.
Speaker 2 (48:09):
Yeah, yeah, yeah, that's what he does primarily. Sometimes he
stands at you know, where all the robot checkout things
are, sure, and just makes sure nobody needs to get
an ID checked or anything.
Speaker 3 (48:20):
Right. Okay. So we got Yolanda, we got Bluey, we
got Pete Raisinson, and we have one more character on
stage, courtesy of our pal, super producer Tennessee Dylan.
Tell us about your character.
Speaker 2 (48:34):
Chip Diggins: golf commentator, father of three, divorced dad, has
to have his frozen fruit for his smoothies.
Speaker 3 (48:40):
Here going by classic all right, So everything we have
learned about dynamic pricing really informs us about surveillance pricing.
We just off the dome, made up four awesome characters,
and honestly, thank you guys, I think that was great work.
Surveillance pricing will not only know all the characteristics we
(49:03):
have described, but will leverage this such that you could
be standing with Bluey and Pete Raisinson right now in
aisle twelve, and you could be looking at
Speaker 4 (49:15):
The same can of beans.
Speaker 3 (49:17):
And that I don't know why I keep going back,
but that same can of beans might be ninety eight
cents for Bluey, it might be a dollar thirteen for
Pete Raisinson, and it might be four dollars for Yolanda.
This is a very strange thing. I think it needs
more attention, and lawmakers agree because here's the next step.
(49:43):
Here's the thing we didn't talk about earlier. You guys.
Facial recognition is happening in grocery stores.
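A toy model of what surveillance pricing implies, using the hosts' own made-up bean prices for Bluey, Pete, and Yolanda; the per-shopper multipliers and the lookup itself are illustrative assumptions, not any retailer's actual mechanism:

```python
# Toy sketch of surveillance pricing: the same item resolves to a
# different price depending on the tracked shopper profile. Shopper
# names and target prices are the hosts' invented examples.

BASE_PRICES = {"can_of_beans": 0.98}

# Hypothetical multipliers inferred from a profile (loyalty card,
# payment card, face) rather than posted on the shelf.
PROFILE_MULTIPLIER = {
    "Bluey": 1.00,     # -> $0.98
    "Pete": 1.15,      # -> $1.13
    "Yolanda": 4.08,   # -> $4.00
}

def personalized_price(item, shopper):
    # Unrecognized shoppers fall back to the base price.
    base = BASE_PRICES[item]
    return round(base * PROFILE_MULTIPLIER.get(shopper, 1.0), 2)

for shopper in ("Bluey", "Pete", "Yolanda"):
    print(shopper, personalized_price("can_of_beans", shopper))
```

The point of the sketch is only that the shelf price stops being a single number: it becomes a function of who is standing in front of the shelf.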
Speaker 2 (49:52):
Okay, So so the grocery store would be able to
identify Bluey pretty clearly, I think, when Bluey walks in,
coming from Australia. Yeah, yeah, sure, sure, sure, for sure,
and then you know all these other characters. So you're
gonna have facial recognition that is linked to whatever card
(50:14):
that you use, or phone number you use, or even
sometimes the credit card or bank card that you use
to buy groceries. That all gets wrapped up into one
big piece of information about you. So in this case
it's also going to be your face.
Speaker 3 (50:29):
Yes, that's correct, Matt or Pete, whichever you prefer. In
this case, it is correct. So we can posit then
that the self checkout line is not just an AI
automation rush. It is a rush on personal information. Kroger
is partnering with Microsoft. The facial recognition is being installed,
(50:55):
creating what they don't call surveillance pricing, what they don't
call dynamic pricing. They call it a personalized shopping experience. Now,
going back to Noel Brown's, or Bluey's if you prefer, Noel,
earlier statement about things seeming awry
or janky, I don't think this benefits consumers at all.
(51:21):
I'm not sure who it benefits. What, the shareholders of
the grocery store? As Noel, er, Bluey, said earlier,
the idea of, don't call me a boomer, as we're
all getting older, and not to sound like a boomer,
nor to agree with you necessarily, Noel. But this feels
a little too far right. Shouldn't a price of a
(51:43):
can of beans be the same price?
Speaker 4 (51:47):
It should be. How much can a can of
beans cost, Michael? Ten dollars? No, it's, yeah, I mean,
I don't know. I get a certain amount of fluctuation
based on availability of products, based on the supply chain, all
that stuff. This pivoting to this kind of system and
trying to sell it as though it in some way
is a benefit to anybody other than the companies is
(52:10):
the most absurd and egregious form of gaslighting. It's really yeah,
it's wild.
Speaker 2 (52:16):
It's very strange. I don't see how they could ever
get away with it. Like, if groceries cost more for
different people, wouldn't we all raise our hands and say, hey,
this is not right. Why is this happening. You could
get your friend who gets the really low prices to
(52:36):
go in and buy all your groceries for you.
Speaker 4 (52:38):
Remember when we talked about the SNAP benefits thing, man?
I don't remember where this came up, but this idea
that the grocery stores weren't allowed to give discounts to
SNAP holders because that in some way violated some sort
of fair play law or something. I can't remember exactly
the deal, but this seems to sit squarely in a
(52:59):
similar gray area, that's actually useless. The thing that we
were talking about before, offering the discounts if SNAP benefits
dried up, that would have been like doing a solid
for people in need. This just feels like it's just
absolutely the opposite of that.
Speaker 2 (53:14):
But what we're talking about here right now is dynamic pricing, right,
We're not talking about individual pricing like personal pricing.
Speaker 4 (53:20):
Yes, that's the nth degree awful version of it.
Speaker 3 (53:24):
We're, we're midway through, we're in the interim
of the spectrum, right. So it's surveillance pricing now; it
will become individual pricing.
Speaker 2 (53:35):
And surveillance pricing says like what like, therefore, this is
the price you get.
Speaker 3 (53:41):
Right, Surveillance pricing says Yolanda walks in. Yolanda gets clocked
with a camera at the beginning, right, and has some
sort of profile that pulls up which dictates the price
of beans or the price of Chobani yogurt, you know.
And then Yolanda checks out, right, at a self
(54:04):
checkout counter, and that will also clock her face, maybe
even people with her, and then it will feed the
data of the purchase into that overall profile or fingerprint,
which informs the next time Yolanda walks in and
does or does not buy Chobani.
Speaker 4 (54:24):
And the new thing here is this facial recognition piece. Right,
if you've if you've ever checked out at one of
these things, you'll notice you're on camera. They could argue
it's a theft deterrent or what have you. But that information
is going somewhere. I don't think you have to sign
anything in the terms that say they have to delete it.
At the very least, that's what the TSA promises. I
don't think you're given any such promise from the groceries,
(54:47):
from the grocery chains.
Speaker 3 (54:49):
We have not, because it's not legally requisite. Right, there
doesn't need to be a sign that says you are
surveilled and stalked. This, I would posit, is an episode in
the future. This is part of what we will call
a K shaped economy, which we don't have time to
(55:09):
talk about today.
Speaker 2 (55:10):
Well, what does that? What does that mean?
Speaker 3 (55:12):
A K-shaped economy is an increasing violation of what
we call the Gini index. K-shaped economy means that
part of a given economic structure is spiraling downward and another
part of it is skyrocketing. It is ultimately untenable.
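For reference, the Gini index mentioned here runs from 0 (everyone earns the same) to 1 (one person earns everything), and can be computed from a sorted income list with a standard formula; the income lists below are made up purely for illustration:

```python
# Gini coefficient from a list of incomes, using the standard
# rank-weighted formula: G = (2 * sum(i * x_i)) / (n * sum(x)) - (n + 1) / n
# where x_i are incomes sorted ascending and i runs from 1 to n.

def gini(incomes):
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))     # 0.0  -- perfect equality
print(gini([0, 0, 0, 100]))   # 0.75 -- one person holds everything
```

A K-shaped economy, in these terms, is one where the distribution drifts toward the second case: the upward arm of the K pulls away while the downward arm sinks, and the index climbs.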
Speaker 4 (55:33):
It's a big divider of class.
Speaker 3 (55:36):
Sure a capital K, right, and then picture the upward
part being the one to ten percent and the downward
part being everybody else. Anyway, thanks for listening to our show, folks.
I don't know, do you guys think this surveillance and
grocery stores? Like? Am I too into it? Or is
(55:57):
this an episode?
Speaker 4 (55:58):
Oh, it's definitely an episode. I'd be interested, too, in what
larger implications this might speak to about the availability of
the stuff and just how it plays into larger surveillance
state aspects and well that but also just literally just
another layer of the surveillance state, you know. And who's
(56:18):
to say that the government can't use that information as well,
or that it isn't feeding into some database that's accessible
in that way.
Speaker 2 (56:26):
I just don't think they can get away with it.
You could have. So what you need to do is
have if you're in the top ten percent, you need
to make a friend who's in the lowest ten percent, right,
and vice versa. Then you two go to the store
together every time you buy each other's groceries.
Speaker 4 (56:44):
Criss-cross! Yes, Strangers on a Train rules.
Speaker 3 (56:48):
I love it. Yeah, very well done, guys, and thank you. Yeah,
I think this can be a good episode in the future.
There's so much stuff we didn't get to, including the
fact that the online retail giant Shein. Shane? I haven't
heard of them. S-H-E-I-N. They have
banned the sale of sex dolls. The interstellar visitor
(57:12):
3I/ATLAS has changed color. We'll have updates on that.
For now, we cannot thank you enough for your time.
We cannot thank our super producer, Dylan the Tennessee pal
Fagan enough for his time, but we'll try it. Dylan,
thank you. How do we do today?
Speaker 5 (57:29):
Excellent job? Boys?
Speaker 3 (57:30):
Oh come on, I love it when you lie to us,
so we want to hear from you. So join up
with the crew. Find us out here in the dark.
You can hit us on the telephonic devices. You can
always write to us with awesome stuff, and you can
find us on the lines, the social meds, shouldst thou sip.
Speaker 4 (57:50):
Find us indeed all over the lines at the handle
Conspiracy Stuff. We exist on Facebook with our Facebook group
Here's Where It Gets Crazy, on X, FKA Twitter, and on YouTube.
You can also find that on Instagram and TikTok at
Conspiracy Stuff Show.
Speaker 2 (58:03):
We have a phone number. It is one eight three
three STD WYTK. When you call in, give yourself
a cool nickname and let us know within the message
if we can use your name and message on the air.
If you want to send us an email, you can
do that too.
Speaker 3 (58:13):
We are the entities that read each piece of correspondence
we receive. Be well aware yet unafraid: sometimes the
void writes back. For everybody who listened to the end
of our weekly Strange News segment, there is something that
you guys said that required me to pull up an
old book. It's called Wealth and Poverty by George Gilder.
(58:37):
Here's the quote. Wealth and poverty are the prime concerns
of economics. But they are subjects too vast and too
vital to be left to the economists alone. I think we
can agree with that. Find us in the Dark Conspiracy
at iHeartRadio dot com.
Speaker 2 (59:14):
Stuff they Don't Want You to Know is a production
of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.