Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:53):
Okay, it's Search Warrant and we're live. You can
listen to us on cage A Radio. You can watch
us on Rumble, LinkedIn, Facebook, and later we'll be on iHeart, Spotify, and Spreaker. We're broadcasting from Salt Lake City, Utah, and Hudson Valley, New York, and today our producer is Bill.
(01:17):
So thank you very much both for helping us. Let
me tell you who we've got to start with. We've
got Detective Jake Jacobs from Philadelphia Homicide and Officer-Involved Shooting Unit, and he's not currently in Philadelphia. He's at an undisclosed location, so his trash is being picked
up for those of you who live in Philadelphia and
(01:38):
don't have your trash picked up. Also, we've got our
correspondent from Tennessee, and we'll be talking about Tennessee: veteran NCIS special agent Greg Highlands. Greg, how you doing? So there's a lot of shit that's happened in Tennessee, so we'll
cover some of that, but I'll give you an idea
(02:02):
what we're going to talk about. Okay,
the title of this episode is artificial intelligence, and you'll
see why, especially in the second half hour. Our special
guest is Christine Lynn Harvey, the editor in chief of
New Living News Independent Media. As I said before, we're
(02:22):
going to touch on artificial intelligence. We're gonna talk about
the Bryan Kohberger guilty plea, the Karen Read trial fallout that's still taking place, the Sandra Birchmore murder trial.
Speaker 2 (02:41):
HM.
Speaker 1 (02:43):
That's next on the agenda in the Canton, near Boston, area. A socialist takeover in New York, July Fourth protests, artificial
intelligence and healthcare. How do we feel about that? The
P. Diddy trial, incidentally. Uh, you know, we had the
(03:05):
Karen Read verdict live, and I was hoping that
we would have the P. Diddy verdict live, but you know,
it came out, it came out earlier than that. So we got screwed on that. Trials on the horizon: Alligator Alley, er, Alligator Alcatraz, we'll talk about that.
(03:27):
Philadelphia, Krasner, our corruption and wrongful prosecutions update, and more. So anyway, okay, so I wanted to, let's try this. Okay,
let's look at a couple of these things quickly and
get them out of the road before we I wanted to.
(03:49):
I was remiss in showing this. Last week, Lacey Lunsford
was kind enough to give a speech in front of
the FOP. It was an honor being a guest speaker
at the New Mexico Fraternal Order of Police conference. Seeing
Brad get a standing ovation and being surrounded by people
who understand and support us was overwhelming. That's great. I
(04:14):
wish the FOP in Philadelphia and everywhere else would be
like that. Right. So, okay, so this Tennessee thing, I
want to get into this a little bit. Let's see. Okay,
(04:35):
Big Tennessee now targeting anyone who gives shelter to illegals,
including their families and landlords. New immigration laws take
the crackdown a step further. Laws create sweeping criminal penalties
for anyone who provides shelter to an undocumented immigrant. The
American Immigration Council says they make it illegal for a
(04:58):
church to offer temporary shelter, a landlord to rent out
a room, or a family member to live with someone who is undocumented. Wow, this should crack down on college campuses and that, shouldn't it? So have you been raided yet, Greg, for having illegals living with you?
(05:21):
Or anything?
Speaker 3 (05:22):
Not not yet. No, no, okay, So I have an
extra dog this week.
Speaker 1 (05:29):
Oh you do what kind.
Speaker 3 (05:31):
Some kind of Japanese dog.
Speaker 1 (05:32):
It's my daughter's, Okay, So Bill, can you bring that
to the top there. One hundred and seventy seven new
Tennessee laws go into effect July first. Here are the
ones that caught our eye. And if you come down
a little bit, it's harboring illegal immigrants. No cell phones
(05:55):
in a classroom. Wow? Nice?
Speaker 2 (05:58):
Uh?
Speaker 1 (05:59):
More recess for elementary kids, new penalties for threats of
mass violence. Licenses revoked for bullying, for teenage drivers that have been bullying people. I mean, it goes on and on.
You really gotta look at that, you know, it's like
(06:19):
common sense. Tennessee is the only state that has common sense nowadays, right? Don't I know it, boy, that's for sure.
Jesus God, it's a disaster.
Speaker 2 (06:34):
Okay.
Speaker 1 (06:37):
I did want to you know, I want to get
into this, uh, what the hell's happening, even past the Karen Read being acquitted thing. But there's another trial in Canton, and I want to run these two things
(07:00):
so people have some idea of what the hell is
going on?
Speaker 4 (07:02):
Okay. Now that the Karen Read trial is done, the
community of Canton will move on to yet another murder
trial filled with questions, uncertainty, and potential misconduct from police officers.
In February of twenty twenty one, twenty four year old
Sandra Birchmore was found unalived in her apartment in Canton, Massachusetts.
(07:27):
It was originally concluded that she had unalived herself via strangulation,
but after additional investigation and prodding from her family and
some investigative journalists, a man by the name of Matthew
Farwell was arrested and charged with the murder of Sandra Birchmore.
Matthew was a police officer in a nearby town called Stoughton,
(07:49):
which is near Canton and in the same county of Norfolk.
He had originally met Sandra when she was only twelve
years old and she had joined a junior police academy
called the Explorers Program. It is alleged that Matthew
started a sexual relationship with Sandra that lasted four years.
(08:09):
In fact, she believed that she was pregnant with his child,
although some very recent reports are telling us that DNA
is saying that he was not in fact the father
of the child. However, text messages prove that she did
share with him that he was the father of her child,
and it is believed that that was the motive for
the alleged murder. The Canton Police Department was investigated once
(08:34):
again for this investigation, and there were multiple findings that
showed that they should have done things a lot differently.
So there's a lot more detail. I have a full
video pinned on my page. Go take a look at
it for all of the details surrounding this. But goodness,
what is going on in Canton?
Speaker 1 (08:55):
Okay? Okay, yeah, okay, let me... Jeez. Bill, hang on. I'll
run this other video so they have an idea of
what the hell's going on. This thing is really I mean,
you know, what is going on in Canton?
Speaker 2 (09:08):
Man? You know?
Speaker 5 (09:11):
First, I'll tell you, I don't know much about this
because I tend to watch trials as they play out
once the court hearings begin, not pretrial. But this
is Sandra Birchmore. This was how old she was when
she got involved in like a junior youth program with
the police officers. And they it's bad, It's really really bad.
(09:33):
You see, she's with child this was the last day
she was alive. You know how this happens. There were
several I think three police officers at the time of
her death rumored to be involved with her. Her death was originally ruled self-elimination. Case closed. A reporter persisted, it was reopened,
(09:55):
and the person who was last seen going into her apartment is being held, federally. He was federally indicted. So there's just a lot, and the same
players that are involved in the Karen Read case are involved
in this case. There were thirty thousand text messages missed
(10:17):
on the person who is, the officer being held,
Speaker 2 (10:21):
For her murder.
Speaker 5 (10:22):
Thirty thousand text messages missed. The same person who testified
about the data in the Karen Read trial. He's the
one who extracted data in that case as well. Thirty
thousand missed text messages, and Bev Cannone ruled that
they were not allowed to bring that up in questioning
in Karen's trial because it wasn't relevant.
Speaker 1 (10:44):
Man, that should be interesting. Huh So all right, let
me uh, okay, well let's go. Let's uh you guys
are up on what the hell happened with Diddy, right, this morning? Okay, I'm just going to run this quick
and then you can tell me I haven't been following
(11:04):
it other than the fact that he collapsed.
Speaker 2 (11:07):
At the end.
Speaker 6 (11:08):
We've reached a verdict on counts two, three, four, and five,
read the note from the jury that came just after
four o'clock. But we cannot reach a verdict on count one,
they wrote, because some jurors have unpersuadable views. That means
they've reached a unanimous decision on whether Sean Diddy Combs
is guilty of sex trafficking and transportation to engage in prostitution.
(11:30):
But the jury of eight men and four women is
split on the most serious and complex charge, racketeering conspiracy.
For that count, jurors must decide whether they believe Combs
ran a criminal organization and committed underlying crimes like kidnapping
and arson. Late today, the judge urging the jury to
keep deliberating to try to reach a decision on that
last count before revealing their verdict. As the jury's note
(11:53):
came in, Combs's attorneys hugged and formed a circle around
the music mogul, who put his head in his hands
at one point giving a thumbs up to his family.
During the seven week trial, prosecutors alleged that the music
mogul ran a criminal enterprise and used power, violence, and
fear to get what he wanted, calling thirty four witnesses,
including two of Combs's former girlfriends, Cassie Ventura and another
(12:16):
known only as Jane. Combs opted not to testify. His
defense team calling no witnesses, telling the jury that the
claims against Combs were badly exaggerated and that physical abuse
and love of baby oil are not federal crimes. The
jury will resume deliberations tomorrow morning. Chloe Melas, NBC News,
New York.
Speaker 1 (12:37):
Love of baby oil. Jake, are they already out of baby oil down there where you're at?
Speaker 2 (12:44):
I went to CVS, actually sold out, and Johnson and Johnson stock went through the roof.
Speaker 1 (12:52):
Oh my god, So what the hell is the deal?
from this morning? He was found not guilty of the RICO. Is that right?
Speaker 3 (13:03):
Not guilty on everything except for the transportation for prostitution, which he might actually get time served for.
Speaker 2 (13:17):
He'll probably be out by the end of the day, probably an ankle monitor, and he'll be put out.
Speaker 1 (13:27):
His uh bodyguard what do they call him? I can't
remember his name, but big something anyway, he's leaving the country.
He's the one that you know, tipped off everybody, I guess,
so he's leaving the country. But anyway, any more thoughts
(13:51):
on Diddy, I mean, you know that would be We're.
Speaker 3 (13:54):
Going to be worried, that's for sure. A lot of
people who came in and it was one of the questions,
you know, they said about government overreach in the case, was that they had people come in and give story-type, uh, information that wasn't related particularly to the case. Like they had... I'm trying to remember the one singer, she
(14:18):
came in and brought in information from Prince, who's been dead for a couple of years, and told stories, stories that he told her over the years.
And so there's a lot of people that are gonna
be worried in the you know, the entertainment industry now
that he's gonna most likely be walking in the streets soon.
Speaker 2 (14:41):
Yeah, slippery right outside the courthouse right now?
Speaker 1 (14:49):
Oh really? Yeah. Baby oil all over the place. Lindy Lou, uh:
He's writing his guest list now for his release today.
Baby oil waiting. Yes, indeed, Wendy.
Speaker 3 (15:04):
Guilty on the transportation to engage in prostitution, which evidently is less than a ten-year sentence, ten years or less.
Speaker 1 (15:14):
But isn't he facing twenty years with the other ones?
Speaker 3 (15:17):
He was, but he's been found not guilty on the
other ones, So there were.
Speaker 2 (15:23):
Five. They're not going to give him a consecutive; it's going to be, like Greg said, maybe time served or something like that.
Speaker 3 (15:30):
He's been in jail for like a over a year now,
so it's.
Speaker 2 (15:38):
Going to get less than five years, John, I mean
more likely five to ten, you know, maybe give him
a long tail. He's not going to get twenty years
for this, you know.
Speaker 1 (15:50):
Is there any way we can hook him to Krasner anyway?
Speaker 2 (15:55):
Krasner's probably been to a couple of the parties. Who knows. Yeah, there you go, to the prosecutor.
Speaker 1 (16:03):
I don't Oh yes, yeah, I think so? Yes, all right, hey,
look not to change the subject, but I got to
do that. Lindy Lou. I was trying to get ahold
of you. We were trying to get ahold of you
because we need a way to contact you, because I
(16:23):
have that exemption for you, that exemption for your is
it your granddaughter or your daughter or grad school? So
if you can reach us via the contact page at
search Warrant podcast dot com. There's a contact page. Just
send us an email, okay, and I'll send you that
(16:45):
that exemption, which will freak the college out and you
won't hear from them. I can, I can tell you
that much. So just send me your email address and
then we'll we'll send the exemption to you. Okay.
Speaker 2 (17:00):
That has to be the next trial, all the people involved in COVID, which, and Fauci, that should have been with the FBI, and they want us to spend their time instead on the baby oil, you know, caper.
Speaker 1 (17:13):
Yeah, Hey, so Bill, can we bring back I'm sorry
to be disjointed, but we got all kinds of shit happening,
but I want to make sure people are aware of
this Smith Smith anyway, Bill, can you bring up that
website that you had up before? Okay, So this is
(17:37):
a timeline we talked about that Sandra or Birchmore relationship,
whether her alleged killer former. Okay, can you come up
a little bit and we up to that timeline thing. Okay,
So yeah, March twenty ten, Birchmore, then twelve, applies
to become a police explorer. March twenty seventh, Matthew Farwell
(18:00):
hired by the Stoughton Police Department, later an instructor for the
Police Explorers. September twenty eleven. June twenty twelve, Farwell begins
meeting with Birchmore during her freshman year in high school.
Sounds like a anyway. October twenty eighth, Farwell sends Birchmore
(18:20):
a friend request. Farwell gets married May twenty thirteen. April tenth, twenty thirteen, Farwell has sex with a fifteen-year-old Birchmore.
Oh God. March twenty seventeen, William Farwell, the twin brother
of Matthew Farwell, becomes a Stoughton police officer. April ten,
(18:44):
twenty twenty, Matthew Farwell texts Birchmore, I'm going to sneak away from work quick. I mean, you can see where this is going, so it should be... can you bring that up?
Speaker 2 (18:55):
Okay?
Speaker 1 (18:59):
Lots of it. Next... I mean, if you search, if you search her name, you'll come up with that, and you can. But it's, it's the next thing to happen. And a lot of the players in the Karen Read case are players in this case, so it should be a proverbial shit show, right? So stand by for
(19:24):
another shit show. Okay, so all right, so I wanted to get to this other shit. Okay, the Alligator Alcatraz is open, right, so let's take a look at this. We've got a couple of very brief videos relative
(19:45):
to that.
Speaker 7 (19:45):
Tomorrow, President Trump is expected to be at the opening
of Florida's Alligator Alcatraz. The detention center is going to
get its first detainees as early as tomorrow. What can
you tell us about the people who will be detained?
Speaker 8 (20:00):
Are they all violent criminals?
Speaker 2 (20:05):
At the start?
Speaker 9 (20:05):
They likely will be individuals who are picked up under
two eighty seven G. These are not going to be
high threat facilities just yet, but they will have the
capacity to hold the worst of the worst unless the
thirteen gang members trend a arragua murders rapist. But it
will be a five thousand bed facility Varney that is
essential to our operational efforts because we do need places
(20:29):
to hold these individuals, these bed spaces before we remove
them from the country.
Speaker 8 (20:34):
So it's not all violent criminals that go.
Speaker 9 (20:37):
There, not likely, but we do need proper facilities, especially
as we're ramping up and really turbo charging those arrests.
Varney. And so Secretary Noem and the Florida leadership there
have done a phenomenal job about rapid speed developing this
location to get as many as five thousand beds, and
(21:00):
we do think that this could be a blueprint for
across the country.
Speaker 7 (21:03):
Do you think it's going to be politically okay to
pick people up in the workplace that may be there
for a few years and then you stick them in.
Speaker 8 (21:12):
The alcatraz in the Everglades? You think that's politically popular
or attractive.
Speaker 9 (21:17):
Varney, we're focused on getting the worst of the worst out of this country. As we said before, rapists, murderers, MS-13 gang members. We actually just had an incident
in Los Angeles where we picked up two Iranians who
were trafficking individuals in this country, likely for terrorist purposes.
Those are who we're focused on getting out of the country,
(21:39):
detaining and removing.
Speaker 8 (21:40):
Okay, Tricia McLaughlin, thanks for joining us.
Speaker 2 (21:43):
I appreciate it.
Speaker 1 (21:43):
Thank you. Jesus. You know, doesn't he understand that you got shitheads that need to be there?
Speaker 2 (21:52):
What's all this political nonsense? I don't care where you pick them up from. They didn't care when they came here, about the tens of millions. So, and if you remember, they told these people, let us get them in the courtroom. That's why the judge is getting... what's that, Wisconsin, where the judge let them go out the back door? So if you let us pick them up in the courtroom,
(22:13):
maybe we, maybe we're not at the chicken plant or whatever getting them, or the farms. We're getting the worst of the worst because they're already in court. So for this guy to sit back and say, oh, you think it's... We don't care about the PC anymore. Get the hell out. You know why you came. It's stop the port.
Speaker 1 (22:35):
Yeah, yeah. Hey, and this video was a big hit. This is inside Alligator Alcatraz. How about that? Well, I
(23:11):
don't think they have air conditioning right six two degrees?
Speaker 2 (23:15):
John. You know Trump got that gold card where you can spend X amount of dollars and, you know, get that pathway to citizenship, right? You should get, you know, there should be a benefit: if you can escape from, oh, it's not PC, we told you we're not PC, if you can escape from Alligator Alcatraz, from Alcatraz, you
(23:36):
should be eligible.
Speaker 1 (23:39):
Oh my god, man, but no vehicles allowed.
Speaker 2 (23:43):
You gotta do it all on foot.
Speaker 1 (23:46):
Yeah, civilization, undocumented and documented, us. You know, we got a comment that, you know, their access is going to be in a jam if there's a hurricane up.
Speaker 2 (23:58):
Already told you how to run, You got to run
in the zigzag, you got to run in there.
Speaker 1 (24:01):
Yeah, oh my god. All right, okay, so how about this.
Speaker 8 (24:09):
Shithead. Game on, he's coming. What do you say to that?
Speaker 10 (24:14):
I'm going to be here to stand up and fight back.
And unlike the current mayor, I'm not going to be
working alongside the Trump administration to build the single largest
deportation force in American history. I'm going to actually represent
each and every New Yorker, and that includes immigrant New Yorkers.
And that means standing up for the laws of this city,
like our sanctuary city policies, which have kept New Yorkers
safe for decades and were defended by Republicans and Democrats
(24:34):
alike for years until we got this mayor who fearmongered
about them so extensively.
Speaker 8 (24:38):
Does that involve the NYPD?
Speaker 10 (24:40):
It means that the NYPD would actually serve New Yorkers
and not assist Ice in their operations. We recently saw
ICE agents.
Speaker 8 (24:47):
So they would not assist ice.
Speaker 10 (24:48):
Not assisting Ice because ultimately their job is public safety.
Speaker 1 (24:56):
All right. So, you know, a lot of, a lot of shit has come out of this Karen Read case, and somewhat, some of it's very amusing. So let's touch
on it just a minute.
Speaker 11 (25:11):
You guys, the jury foreman from the Karen Read trial has come out to say some shit, and oh my god, even he sees through the bullshit. Literally, it says: costs, shoddy police work, no justice for victim. No evidence, high cost, bad police work, all red flags for juror number one, aka the jury foreman. Y'all, when I tell you that
(25:32):
he calls out all these people for having privilege. Oh
my god, Oh.
Speaker 8 (25:36):
I love this for them.
Speaker 2 (25:37):
Okay.
Speaker 11 (25:37):
The jury foreman stated that there was no aha moment
in the grueling case, that he said wasted the taxpayers' money on paper-thin evidence that failed to bring any comfort to the victim's family.
Speaker 2 (25:51):
He's coming for blood.
Speaker 11 (25:52):
The unforgivable fact that the investigators didn't swarm the house
at the murder scene on 34 Fairview Road in Canton during a nor'easter was shoddy work and a blatant red flag, the foreman added. That's why the
only guilty finding was drunken driving. If that body was
on my front steps, I know my house would have
(26:13):
been stormed. The juror, who was black and grew up
in JP, hell yeah, said the texts from Trooper Proctor, bitch and probably a wack job cunt, also showed a
serious bias. Yet I had to put my personal opinion aside.
Oh my god, he's not lying, like, he fucking... He
(26:34):
goes on to say, it all seemed like a wishy
washy privilege we had.
Speaker 1 (26:41):
Yeah, so let's look at the front yard of this place. I mean, you know, Jesus, they knew
Speaker 2 (26:47):
The entire time.
Speaker 11 (26:54):
Sit. This video was taken inside 34 Fairview
(27:14):
Road when Karen Read's legal team was able to
go inside the home and do their own investigation. The
purpose of this test was to see could you hear
what was going on outside of the home near the
flagpole where John O'Keefe was found while being inside Brian
Albert and Nicole Albert's bedroom. That's Alan Jackson, by the way.
(27:35):
This video was shot from Brian and Nicole's old bedroom, which
are these two windows right here? And if you don't remember,
John O'Keefe was found right here, which means when Brian
Albert and Nicole Albert first testified in trial one, you
had to believe that they were in such a cold
sleep that they couldn't hear what was going on right outside.
Speaker 1 (27:57):
Okay, so that's weird, and that's beyond weird. So how
about the basement. What's the basement look like?
Speaker 11 (28:08):
New video was just released from the basement of 34 Fairview Road, when Karen Read's legal team
was able to go into the house and do a
little viewing. This area of the home was of particular speculation,
specifically during trial one, of where John O'Keefe went the last moments of his life on January twenty-ninth, twenty twenty-two. As a reminder, these are the only two images that
As a reminder, these are the only two images that
(28:30):
we have seen publicly of the basement of thirty four
Fair of You Road. But now we have video footed.
Speaker 10 (28:41):
And once you get up in the inside, that room is some, some sort of plywood?
Speaker 12 (28:47):
What is that?
Speaker 2 (28:48):
Is it OSB or, you know, sub... No, there's no floor.
Speaker 10 (28:54):
Basically, every single part of this part of the basement of.
Speaker 3 (28:57):
This part of the house.
Speaker 13 (28:58):
The floor has been taken up.
Speaker 8 (29:00):
Yeah, I mean all of this it's just hodgepodge, right.
Speaker 10 (29:03):
It just looks like it was thrown down, so that
the actual floor was ripped up, and this is just
a hodgepodge of plywood all the.
Speaker 1 (29:09):
Way through here. Ye.
Speaker 8 (29:11):
Yeah, the entire floor was ripped up.
Speaker 3 (29:18):
And then this goes into a storage area.
Speaker 8 (29:20):
It looks like this.
Speaker 14 (29:21):
Area did not have flooring in it to begin with.
Speaker 1 (29:23):
But all this is relatively.
Speaker 3 (29:27):
It looks to be relatively recent plywood.
Speaker 11 (29:30):
That's the bulkhead door right there.
Speaker 2 (29:32):
That door.
Speaker 11 (29:36):
The stairs are.
Speaker 12 (29:36):
Right you getting enough light to actually get this Yeah.
Speaker 11 (29:40):
Okay, okay, reviewing where Alan Jackson just took all of us.
He came down the basement stairs right here, so this
is where the video was first started. And then you
see them walking through all of the storage. They went into here, he showed you, so, this little storage compartment.
(30:02):
They didn't show you the bathroom. And then magically, of
course I've mentioned this before, the bulkhead door going outside
is not actually on this layout. That is a window,
but they didn't actually list the bulkhead outside door. That's
a theory we can talk about another day, because I
specifically remember this exact photo being shown during Trial one
(30:25):
during Brian Albert's cross examination, and Brian Albert was like
almost shocked that they even had this photo. They had
no idea that Karen and her team were able to
go into 34 Fairview, and this door is
the original pine door to the house. I find it
very interesting that the photos that they used from when
they sold 34 Fairview Road, for when
(30:45):
the Alberts sold the home, it showed these images, so
you can clearly see that that is carpet and then
this right here is like that puzzle piece type of
gym floor. And you can argue, sure, pulling up the
dirty Jim flooring before a new homeowner comes in, sure,
But to pull up an entire carpet is weird. Like,
(31:06):
I don't understand why they would have spent the money
to put in this carpet just to rip it up
before they sold it. And as a reminder, it was
John O'Keefe's cell phone data that stated he did three
flights of stairs around the twelve twenty am time frame,
the same time frame Colin Albert testified that he did the same flights of stairs. Anyway, just something to ponder.
Speaker 1 (31:33):
So they can hear people from the outside, and they
removed all the basement flooring. Thoughts.
Speaker 2 (31:44):
You know, some people are like, I still believe that Karen Read is guilty. Believe me.
Speaker 1 (31:48):
This, Yeah, I know, probably awesome.
Speaker 15 (31:53):
I mean the fact that you would do that at
that time, you would have to think that people are
going to think you look kind of thinky on that.
So so regardless of what whether it was a righteous
deal or not, you had to know as a cop
that that's probably not a good idea to do that.
Speaker 1 (32:12):
Yeah. So somebody analyzed the uh I think it was
the ABC. Well, it's commentary regarding the ABC interview of
the knuckleheads that were in the house, and I refer to it as whiplash. Let's just look at this quick.
Speaker 16 (32:29):
Did you hear about the McCabes getting treated for whiplash today? Yeah, after they broke their freaking necks when this guy Brian Albert decides to say, yeah, the cops were in my house day one, they were in the house right away in the morning. Wait, what? When John was already gone? Wait,
(32:50):
how did you know he was already gone? These two turned their necks so fast when he said it, and then looked at Chris like, oh crap, we're screwed. His wife.
Speaker 8 (33:03):
Not unbelievable.
Speaker 16 (33:06):
This picture could say, is worth a thousand words. Unbelievable.
Speaker 17 (33:12):
By the time I came downstairs, the police were already
in my house.
Speaker 2 (33:17):
John was already gone. There was nobody to save. The
police were already in my house. John was already gone.
There was nobody to save.
Speaker 1 (33:31):
What? Jesus Christ, man. What do you think of that? Jesus. Okay,
so remember this guy. I won't run the whole thing,
but remember this guy from before, from last week. I
(33:52):
think we had him last week, right, you.
Speaker 17 (33:53):
Know, I'm just seeing this Nightline interview with the mcalbert
from last night. Oh my god, dude, these idiots thought
this was gonna make them look more innocent.
Speaker 1 (34:04):
Bro, these motherfuckers are so much more guilty.
Speaker 17 (34:07):
Where Brian Albert goes, Jen McCabe burst in my room
and says, John, I think I think John is dead.
Speaker 1 (34:18):
Out in front of your house.
Speaker 18 (34:20):
And I said, what the fuck is John doing in
front of my house?
Speaker 1 (34:23):
What would John be doing in front of my house?
I don't know, dude, maybe you.
Speaker 3 (34:28):
Guys were drinking last night at the bar.
Speaker 1 (34:29):
You guys invited him back.
Speaker 17 (34:31):
Karen and every single person in this family supposedly saw
Karen's car, which had John in it, in front of
your house moving multiple times. Maybe because Jen McCabe texted John about being at the house.
Speaker 2 (34:46):
Hey are you here?
Speaker 1 (34:47):
Where are you? Called him seven fucking times?
Speaker 17 (34:49):
Maybe because Higgins the bowling ball said that he fucking.
Speaker 18 (34:53):
He also texted John. He was anticipating him being there.
Maybe that's why he'd being in front of your house
because everybody was fucking coming there. What do you mean,
why the fuck is he in front.
Speaker 1 (35:04):
Of my house?
Speaker 18 (35:05):
You guys never heard from him, supposedly, like, all this, fucking dude. If I've learned anything about being a conservative and being a Republican, whatever you want to call it, being on the right of politics, it's that TV is
fucking bullshit.
Speaker 12 (35:19):
And the stuff you see on the news is bullshit.
Speaker 1 (35:23):
So it's like, yeah, yeah, exactly. Oh god. So anyway, Also,
the ring camera at Karen Read's and John's house, uh,
is conveniently missing?
Speaker 2 (35:45):
Wow, I mean it's so when you moved it, where
did it go?
Speaker 1 (35:51):
Uh? Ah? I mean something about hang on a minute. Wait,
oh here, I'll just run this really briefly.
Speaker 8 (36:07):
So one of the mysteries that still remains about this
case is missing ring videos from John O'Keefe's home.
Speaker 19 (36:15):
There's ring camera video from like eleven thirty the night
before on the twenty eighth, when Michael Camerono comes to
the house to pick up his daughter, and there's ring
camera video the next time at five oh seven am,
when Karen Read is leaving. If you do the natural math, there's a point in time when Karen Read comes home
(36:36):
and puts her car in the garage in between those
two time periods, but there's no ring camera footage of that.
Speaker 20 (36:43):
We know the ring video had to be deleted because
at twelve forty one, I leave a voicemail to John
where you can clearly hear me pull into John's garage.
You can hear my car sensors beeping as I come
close to the, to the sides of the garage door, and then you hear the car go off, and then you hear that, the sled thing, as I open the door,
(37:04):
shut the door. I had high heels on that night.
You hear me walk in, go in through the stairs
up into the kitchen, shut the door, and you hear
me walk into the kitchen. It's like a full minute long.
That's at twelve forty one. But the ring video of
me arriving home is mysteriously missing. The prosecution is accusing
me of deleting the ring, and we're accusing them.
Speaker 1 (37:27):
The O'Keefe family believes that when they did this was
when Karen got home from the hospital with her dad.
Speaker 21 (37:35):
There was certainly some evidence in this trial that Karen
Read got home and had gone up to John O'Keefe's bedroom.
Speaker 2 (37:45):
And if you know about how long were they in the.
Speaker 4 (37:47):
Upstairs area the house, I would say ten to fifteen minutes.
Speaker 21 (37:52):
Karen Read had access to the computer in the bedroom,
so there's a possibility or an inference that she would
have deleted ring footage that she didn't believe was
favorable to her.
Speaker 18 (38:05):
So this would have been the time that Miss Read, the defendant, arrived with her family from the hospital as well.
Speaker 20 (38:10):
Correct Why am I deleting these videos? If anything, I
mean I want the video when I arrive home to
show that my tail light's intact. I don't know that
it would have captured it, but it could have. They
wouldn't want that video in because it would show my
tail lights intact. So who has a greater incentive to
delete the video?
Speaker 2 (38:38):
Thought, it definitely will be a movie.
Speaker 3 (38:44):
Oh yeah, for sure.
Speaker 1 (38:46):
Yeah. I mean if her tail light is intact and
they you know, well.
Speaker 2 (38:53):
She's making sense. I mean, why would she delete it
if it's showing her getting home? Not unless she's getting home and the tail light's not intact. But she seems to be sure that the tail light is intact, right. And there's no, the last I heard, there was no broken tail light on the scene where he was allegedly
(39:17):
hit and run over by her.
Speaker 1 (39:18):
Read just you. I mean, there's so many problems with this.
It'll be interesting to see that in this next traw
with that little girl, that that teenager that was you know.
Speaker 2 (39:37):
His credibility is already shot. If his credibility is shot, they can't retry. Now you got a twelve-year-old who's getting statutorily raped by one, two, three, who knows how many cops? Yeah. And did the baby survive, John, or is, did the baby, is the baby also deceased?
Speaker 1 (39:57):
I don't think so. I don't because it looks like
it was a.
Speaker 2 (40:00):
Little bit far enough along if that lasted.
Speaker 1 (40:04):
Yeah, okay, So I want to just get into this
Kohberger thing. Hang on a minute. Okay, this is, uh,
I think this is an attorney, a young attorney, that is,
I'm not going to run this whole thing. You get
an idea of you know, his his thinking, because I'd
(40:25):
like to get you guys' idea. So Kohberger pled guilty, uh, with an intended life sentence instead of the death penalty, right?
So he's reacting to that.
Speaker 12 (40:42):
All right. So Bryan Kohberger enters a plea. Huh. This
was a surprise to me.
Speaker 8 (40:47):
A case with this.
Speaker 12 (40:48):
Much notoriety and this much at stake, I would have
anticipated it ending in a trial. But Kohberger's team and Kohberger decided that he could spare
Speaker 3 (40:58):
His life if he goes this route.
Speaker 12 (40:59):
And apparently that was a real concern for him. It
had to be because the deal has to be for
life in prison.
Speaker 3 (41:06):
I would be shocked if it was for anything else.
Speaker 12 (41:09):
I haven't seen the deal, but that seems like the
only bargaining chip that there could have been is if
we forgo trial, you won't seek the death penalty, and he'll live out
hell out.
Speaker 3 (41:19):
Of life for the prison for the rest of his life.
Speaker 12 (41:23):
So the reason why I'm so surprised is a case
like this, it's maybe open and shut.
Speaker 3 (41:31):
I don't know. I don't know the evidence.
Speaker 12 (41:33):
Other than what I've read and what I have seen
on the news. It seemed like it was getting better
and better. The last things I read is there were
eyewitnesses that saw Kohberger at the house that night. And
if you can put him at that house that night
with eyewitness testimony, in my opinion, you're cooked. I mean,
there's no way. There's other circumstantial evidence that points to him.
(41:54):
So I think that would have been the nail in the coffin, quite frankly. However, even if he was convicted
at the trial, it would have to go to the
death phase, and at the death phase is where a jury would determine whether or not he should be put to, or sentenced to, death, right? So there's a lot of
mitigation that comes in in that phase that his lawyers.
Speaker 1 (42:12):
Wouldn't have been able to argue and use.
Speaker 12 (42:13):
But again, all that is out the window. It doesn't
matter anymore because he has entered a plea. I read
a lot that the families are upset about this based
on the articles that I've read. They had loose conversations
with the prosecuting attorneys on Friday about a potential plea.
I guess that they were all totally against any sort
(42:35):
of plea, but it seemed like it was very preliminary,
that the prosecutors didn't think it would happen, and then
a deal was struck, it sounds like, Sunday, and Sunday or
Monday the news broke that he had entered a plea.
So a lot of people are asking, how can he
enter a plea? How can they make that offer.
Speaker 3 (42:50):
Without consulting with the families. Well, they did consult with
the families.
Speaker 12 (42:54):
Ultimately, it's not their decision. The prosecuting attorney ultimately would
make the call whether or not they are going to
offer a deal such as the one that has been
offered to Kohberger and apparently accepted. So while victims do
have rights, they're not going to be able to usurp
a decision from the prosecuting attorney, and who knows what
that was for. I mean, as open and shut as
(43:16):
this case may seem, they may believe that there are
certain loopholes where Kohberger could have hung a jury, that could have gotten an acquittal, so it may be a
little bit of a risk analysis on their part as well.
I mean to say that putting somebody in prison for
life is not a win, I think would really be
undermining what that truly is. It's a long time, he's
(43:40):
never getting out. Yeah, he's not going to die. But
some people may say I'd rather die than spend the
rest of my life behind bars.
Speaker 1 (43:47):
So who knows.
Speaker 12 (43:48):
But it's definitely been a polarizing legal issue in the news.
I think, right or wrong, people are looking forward to
the trial. One of the caveats of the plea deal appears to be he doesn't have to explain himself or
talk about what occurred, and that's actually I don't even
(44:09):
know that that's really a deal. So in any scenario,
a criminal defendant.
Speaker 1 (44:15):
Doesn't have to speak.
Speaker 12 (44:16):
He can just sit there and not say a word.
A lot of the times, the reason why you want
somebody to speak is for mitigation purposes. Well, if he's
going to prison for life, what's he going to say.
Speaker 3 (44:25):
There's nothing to say.
Speaker 12 (44:27):
He can't say anything to help him reduce the sentence,
So he's probably not going to say anything, and he's
going to get sentenced. Then you know, the general public
will continue to wonder why this occurred, unless sometime down
the road Kohberger writes a book, or family members write
a book, or his lawyers write a book.
Speaker 3 (44:46):
Who knows.
Speaker 12 (44:46):
But other than that, it seems like the truth, or as to why it occurred, will die with Kohberger unless
he decides that he's going to talk about it one day, that.
Speaker 3 (45:03):
This is a philosophical discussion that you know, I've had
many times over the years between justice and vengeance. You know,
justice will always leave you hollow. It leaves the victims hollow.
It leaves the victims' family hollow. You know, we want vengeance. And so you have these four college students who were murdered, and the families definitely have a stake
(45:24):
in, and are personally connected with a stake in, what happens. And
I got to read some of their comments yesterday. And
I've followed this case loosely over the years because while
it happened in Moscow, Idaho, Kohberger is from Pennsylvania, and
so there was there was quite a bit of Pennsylvania
(45:46):
stuff going on with this case.
Speaker 1 (45:51):
Jake, what do you think.
Speaker 2 (45:53):
I think Idaho has the firing squad for the death penalty. Yeah,
they might have wanted to see that. I understand that
attorney's argument. Now, the only problem that could possibly come up is if the judge doesn't accept it, because the life sentences, the way I understand it, are all consecutive, not concurrent,
(46:14):
and the burglary is also consecutive. So he's definitely never
getting out. Yeah, but the judge can throw a monkey wrench in it, you know, instantly: well, I think he is only entitled to twenty-five to fifty. I mean, we could, we could be crazy, but you got the Menendez brothers now, whose punishment... could be getting out of jail,
(46:38):
so you know, they blew their mother's head off.
Speaker 1 (46:42):
Ah, Lindy Lou. Greg, you got to read this. Yeah, that's exactly what we need: replace the current system with a vengeance system. Yeah, great idea.
Speaker 3 (46:55):
Yeah, it seems like the social contract that we all agreed to, or were born into, is changing every day.
So maybe we should go to.
Speaker 2 (47:02):
That: consequences for your actions. And I think, you know,
I think what the family believes is there's no consequence.
When you got to sit there and hear that your
little girl was stabbed to death, and the next little girl was stabbed to death, and your next little girl
(47:22):
was stabbed to death, and your son was stabbed to death viciously and violently. You want that to occur. And I know we all sit here... Yeah, I know I'd want it to occur, if not by my own hands. So I can't blame the family and I can't blame instituting the vengeance system. Yeah.
Speaker 1 (47:46):
Yeah, for sure. That guy is creepy looking, though, is
he not?
Speaker 8 (47:51):
Yes?
Speaker 2 (47:52):
Well, that was the other thing they said, the Bundy fantasy.
Speaker 1 (47:56):
Yeah, yeah, all right. Oh, that's right: Kohberger is still
in his Bundy fantasy, and he still thinks he's going
to outsmart the system. This isn't the end of him. Yes, correct. So has he... I take it he's signed a plea deal?
Speaker 2 (48:17):
Right. It happened right before the show, Yes, right before
our show started.
Speaker 1 (48:24):
Okay, well, so, but the parents, the one group of parents, is pissed off about this plea agreement. I mean,
how do you guys really feel about that? I mean,
they weren't consulted or anything.
Speaker 3 (48:43):
I think it'd be easier to understand if they actually
gave you the reasons why they want the plea. Why, why does the prosecution, why does the prosecution want to accept the plea? Well, what's the reasoning behind it?
Speaker 1 (48:54):
Well, there's no risk, absolutely zero.
Speaker 3 (48:56):
risk, right? After... I mean, is it money involved? Is, are they worried that it's going to bleed the, I don't know whether it's the state or the county, dry? I don't, I don't, you know, those kind of things.
I haven't really seen a whole lot of reporting on that.
I you know, I personally, I mean, just who cares
what I think? But I would like to see it
(49:17):
go to trial. I mean, I'm hoping that the judge,
you know, weighs in on it as well and says, screw this, we're not, we're not doing this. Yeah. Hey, I wanted
to... this is, you know, disjointed. Okay. So I wanted to bring this up, though, you know. But that socialist in New York City, the, you know, Democratic
Speaker 1 (49:43):
Primary winner for the mayor, apparently he tweeted this. This
is verified to be true, and check the date. This
piece of shit tweeted this when the Challenger exploded. New York City is cooked. So apparently Zohran tweeted, lmao,
(50:07):
the shuttle just exploded. Eat shit, America. Unbelievable.
Speaker 2 (50:17):
I mean, we've opened our border to terrorists. I mean, I believe Omar is a terrorist. This guy is a terrorist, in my opinion. How New Yorkers could elect this guy, how he could see one vote other than his own,
is beyond me.
Speaker 3 (50:34):
Hey, we got serious problems coming up. We got these
young people that are now part of Antifa, and they
know everything, and they've been educated by TikTok and Instagram
reels and they didn't get taught anything in school. You know,
my son, I, one of the reasons why I moved to Tennessee was I wanted to give him a good education. He was going into his freshman year of high school,
(50:55):
and I thought, well, the class sizes in Tennessee are small. Well, COVID hit; he only did a year and a half of a four-year program. Oh jeez. Graduated, really, because he had the credits. So he, so he did about a year and a half of four
years of high school. He's not the only one. So
we have we have a whole generation of people that
(51:16):
they doubt that nine-eleven happened, you know, who did it, you know. There's, there's all these conspiracy theories, and not all of them are bad, bad theories. There's a little bit, pieces of truth, to everything.
But they've been educated by TikTok and Instagram and other platforms,
and they don't get taught fucking civics in school.
Speaker 1 (51:36):
They don't get.
Speaker 3 (51:39):
You know, when we moved to Rhode Island, my son
grew up in elementary school Rosa Parks Elementary in Virginia.
I take him to New England. He thinks he's going
to get some good history classes. No, he got the
non violence approach to education. So he didn't get taught
this enough. So I wonder the public. You know, it's
(52:00):
my fault as a parent. I probably should have did
a homeschool thing or put him into a private school.
He's a smart kid. He figured it all out on
his own. He's pretty well read. And but there's there's
a lot of kids that aren't. And so these are
the people that are going to vote for this guy.
You know, over our years, John, I mean, when
we did a security clearance, they used to ask us
(52:22):
if we were communists. They don't have anymore. I asked
you if you a part of a group that's going
to overflow the US government. Well, this guy, this guy
has some serious problems. One, he's I think he's probably
more not a socialist, he's more a communist. And two,
I think he's bent on Sharia law, which would be
very interesting to enforce in New York City. But there
(52:46):
are parts of New York City that are patrolled by
Chinese cops, so maybe maybe it will work in New
York City.
Speaker 1 (52:53):
Okay, so you've got, you guys have seen Escape from New York, right? Oh yeah. So isn't that the
You just put a big giant fence up around it
and let them kill each other and end it.
Speaker 2 (53:10):
Well, Russell might agree to that, but yeah, you're going
to have a mayor who's going to have access, a terrorist mayor who's going to have access to intelligence.
Speaker 3 (53:25):
We can't have the number one police department, the biggest
police department in the country, that has paramilitary-type units. They have SWAT teams, they have their own, as Jake just said, they have their own intelligence network. You know, it's, it's, I can't think that there would be a National Guard unit anywhere in the country that wouldn't be proud to
(53:48):
have NYPD, and the money that they have behind them, and the toys that they have. I'm trying to think of better words, but I'll use the word toys, right, right. He's going to have that at his disposal, to either crash it or, or leverage it, you know, to use it.
Speaker 1 (54:10):
Well, I think speaking for the rest of New York State,
I mean, can we just cut it off and let
it float out to sea? Yeah, I mean that's, you know, uh.
Speaker 2 (54:23):
John, Mamdani, I mean, that is scary stuff there. And Bragg and Letitia James. What the hell are y'all doing in that state?
Speaker 1 (54:36):
I mean, that's a bad that's a bad combination. That's
a really bad combination. Hopefully Curtis Sliwa can, uh, pull it off, right?
Speaker 2 (54:49):
Not a bitty bill of baby Well.
Speaker 1 (54:56):
I gotta tell you, I'm, I'm afraid that, well, a lot of people... So apparently Trump came out with a
thing this morning that says he won't let that happen.
(55:17):
So we'll have to see.
Speaker 2 (55:18):
Gotta be about another thirty lawsuits, you know. Ketanji Brown Jackson, Sotomayor, and Kagan doing their nonsense. No, yeah, that wasn't the right one. That was education.
Speaker 1 (55:36):
Yeah, that was that. I'm just there. Okay, we got
a break. Hey, Look so we're going to be right
back with special, with our special guest, Christine Lynn Harvey, uh, the editor in chief of New Living News Independent Media.
We're going to talk about artificial intelligence, more specifically artificial
(55:57):
intelligence and healthcare, it could really mess you up. So stay
where you're at and we'll be right back. Okay, Okay,
we're back. Hey. Uh so I'll let you know who
we've got. We've got Detective Jake Jacobs from Philadelphia Homicide and Officer-Involved Shooting Unit. And we've got our correspondent from Tennessee, veteran NCIS
(56:19):
special agent Greg Highlands. And I'm John Stanton, a veteran NCIS special agent. So we have a
special guest this uh this hour, but I wanted to
touch on this, uh Bill, can you show website number
one on ED please? Okay, So if you've watched this
(56:44):
at all, you know that we have, we've had a special guest attorney, and he's, uh, just awesome.
And he was intricately involved in this this lawsuit against
CBS which is forcing them to pay an eight figure
(57:09):
amount to change our editorial policy in a settlement with
the President Trump. I'm hesitant to give you his name
because he gets a lot of hate mail. So, but
you know who he is. Ed Okay, so he's doing
a hell of a job.
Speaker 2 (57:29):
You know what's funny John with that is they said
we didn't have to say we were sorry publicly, but
you got to give up thirty billion dollars. I'm sorry.
Speaker 1 (57:44):
Oh my god, Yeah, exactly. Okay, So I just want
to touch base on the you know, one other two
other issues. One you know, we've already gone through this Canton,
Massachusetts where they have a large corruption problem of epic proportion.
(58:09):
You know, it doesn't mean to say that. Uh you know,
I don't think it just stops in Canton. It, uh, you know, it, uh, negatively impacted, uh, Sergeant Finnerty. Uh, you know, uh, and this is Sergeant Finnerty with his two younger daughters and, uh, his older
(58:34):
daughter. And Bill, can you bring up number four, Finnerty's GIF? Send, go. You know, he was wrongfully prosecuted for
a totally bullshit thing. And when you look into it,
and we've covered it, uh, it's bad and it's indicative
of what the hell's going on up there. So you know,
(58:57):
take a look at our previous episodes where we were talking about wrongful prosecutions, uh, because Dave Finnerty needs, uh,
you know, to turn this around because he was wrongfully prosecuted. Okay,
same thing with, uh, Lacey Lunsford, Brad Lunsford. The, uh, yeah,
(59:18):
after Brad Lunsford was you know, wrongfully prosecuted by a
Soros-installed attorney general in New Mexico, and please, uh, you know, contribute if you can, because they really have to. Uh.
(59:38):
You know, I mentioned before that Lacey, uh, you know, was kind enough to give a speech in front of the FOP, but you know, Brad was wrongfully, uh, prosecuted, and, uh, this Soros-installed attorney general is
still pissing around with the motions and stuff, trying to
save face because you know where there was a major
(01:00:01):
problem with that case particularly, and that's theirs, right.
Speaker 2 (01:00:06):
You know, the progressive, or whatever you call them, reform DAs and prosecutors. You know, every law enforcement officer will tell you the one thing we hate the most is a
dirty or corrupt cop.
Speaker 1 (01:00:20):
We don't get exactly exactly I hear you.
Speaker 2 (01:00:23):
That law enforcement officers hate bad law enforcement officers, like the ones alleged in Canton, and they're the worst, you know. So we don't need the progressive prosecutor. We police ourselves to a certain degree, that we would get
rid of them, and we will tell the progressive prosecutor
to get rid of this piece of crap if we need to.
Speaker 1 (01:00:42):
Right. This is, uh, I mean, this was relatively recent. I didn't show, we didn't show this last week. I
wanted to show it, but we ran out of time. Anyway,
we received confirmation and our request has been referred to
the Special Litigation Section of the Civil Rights Division. So
(01:01:04):
the National Police Association requests a federal civil rights investigation of New Mexico Attorney General Raúl Torrez. He's the guy that was involved in the wrongful prosecution of Brad Lunsford. And we've been through that, and you can see the previous episodes that we had on Officer Brad Lunsford,
(01:01:27):
and it's, the whole case is bullshit. You know, the, the subject killed himself. Okay, that's it. That's it. That's the summary. Subject killed himself. Finnerty.
Speaker 14 (01:01:39):
Uh.
Speaker 1 (01:01:40):
You know, they prosecuted him so they could, so they could satisfy an agenda and give immunity to the people that actually were responsible for the problem. So
I don't know what the hell you're doing up there
in Boston, but it's getting worse and it's getting a
(01:02:02):
lot of attention. So get your shit together. You know,
if you want to tell us how great you are
up there in Canton especially, you know, give us a call.
We'd be happy to have you come. On the same
thing with Leah Foley. Uh, the interim US attorney, why
did you wrongfully prosecute Sergeant infinnerity And for that matter,
(01:02:26):
Raúl Torrez, explain to us what the hell was going on. I mean, that case is so hosed up, it's
Speaker 2 (01:02:36):
Unbelievad of you know, the one guy we had on
earlier about with young Karen Reid stuff, you know, you
say on the right and everything. You know, we take
so much time on this side to get something going on. Yeah, well,
they would already had us locked up in jail and sentenced,
and we're sitting back saying, well we still need this
paper clip. Enough enough nonsense.
Speaker 1 (01:02:59):
Yeah, oh God almighty, boy. Okay. So our special guest is Christine Lynn Harvey. She's the editor in chief of New Living News Independent Media. Bill, can you bring up Christina... Christine, how you doing? How are you? Good.
(01:03:21):
How are you doing good?
Speaker 2 (01:03:23):
So?
Speaker 1 (01:03:24):
I know we want to talk about artificial intelligence specifically
geared towards healthcare.
Speaker 14 (01:03:31):
Right. Well, I want to talk about it as a broad issue, okay? It has not only health concerns but also social issues, basically the future of humanity rests on this new technology. Yeah, I don't know if most people have
(01:03:56):
had experience playing around with things like artificial intelligence, like ChatGPT and things like that, and it can
be a tool, it can help you. I find it
useful to brainstorm ideas, to kind of flesh out concepts,
(01:04:17):
to research various topics, because it's way better than Google. Actually,
it will just keep on giving you more and more information.
But at the same time, if you ever notice these
ChatGPTs and Groks and all these other artificial intelligence
(01:04:39):
apps that you can download, if you notice, they almost always give you positive feedback. So if you come up
with an idea and say, let's work on this idea today,
this is my idea, then ChatGPT will come
back to you and say, oh, that's a wonderful idea.
It just gives you, you know, so you feel like
(01:04:59):
inspired, and that's a positive thing. But what's been, what's been happening: people are actually becoming mentally ill
from using these apps in the sense that they rely
on the apps too much. The apps almost become human,
like a human companion. And you can tell this ChatGPT
(01:05:23):
everything about yourself. You can tell your whole life story
and it will remember everything. So come back to you
later on and say, oh, you remember that time you
told me this about that. Well, yeah, let's talk about
you know. So there's this weird there's this weird connection
that happens between this artificial intelligence and you as a
(01:05:49):
human being. So if you're vulnerable to begin with, say
you were abused as a child and now you're in
a marriage where you're abused, and you just want to find some escape and some relief, so you go to this, you go to this artificial intelligence and you
start trying to have a relationship with it. And what
(01:06:12):
has happened with people, some people who have been using this:
they have a break with reality. They become psychotic, they
become paranoid, there's a good article at futurism dot com
about this beginning to happen. But you're not going
to hear about it for a while because it's going
(01:06:33):
to take a while before it gets its way into
mainstream media as well.
Speaker 2 (01:06:39):
You know.
Speaker 1 (01:06:39):
One of the problems, I mean, just I'm just going
to give you an example. ChatGPT: if you
ask it a question, it's going to give you one result,
and if you ask GROC another the same question, they
have different responses.
Speaker 2 (01:07:01):
Uh.
Speaker 1 (01:07:01):
And I'll just give you an example, not to get heavy into this, but it's just an example with the Sandusky case. With ChatGPT, it doesn't tell you the real story. Grok tells you the story.
Speaker 14 (01:07:15):
Definitely censored, because a couple of times I tried to use it for something, coming up with an image or a logo or something, and it refused to help me because it said it didn't fit its so-called guidelines.
Speaker 1 (01:07:31):
You know, Greg, you have a visitor.
Speaker 3 (01:07:36):
That's... that's... I think it's called a Shiba.
Speaker 1 (01:07:41):
Okay, it's not an AI-generated Shiba.
Speaker 3 (01:07:45):
She's much smarter than anybody that I know.
Speaker 1 (01:07:48):
Okay, okay, great. Yeah, I mean, AI is... okay. So we just talked about, I just want to kind of mention this, we talked about Tennessee enacting what, one hundred and seventy-seven new laws, right? One of which is that kids can't have their phones in their classroom.
(01:08:11):
So they're not ChatGPTing or Grokking, right?
Speaker 14 (01:08:19):
And they shouldn't, because the whole purpose, you see, this is the other worry about artificial intelligence, is that it's going to make us soft, meaning intellectually, critical-thinking-skills-wise, we're going to get soft. Our brains are going to turn to...
Speaker 1 (01:08:39):
Jell-O. And I think that happened about five years ago, right? I agree.
Speaker 2 (01:08:45):
I think they're already making us soft. You know, we're importing workers, you know, stuff. I used to go to the fields and pick blueberries. Now they're saying you've got to be an illegal immigrant or an immigrant to pick blueberries. I did it when I was a kid. So you know, we're already getting slow. You know, the media...
Speaker 1 (01:09:02):
Lindy Lou, right on the money. Jesus squeeze.
Speaker 14 (01:09:12):
I guess there's a fine line between, you know, innovation, growth, you know, boosting the economy, and also not letting corporations do whatever the hell they want. There has to be some regulation. And speaking to this topic, the current
(01:09:34):
administration has rushed to lift any kind of restrictions on artificial intelligence whatsoever. So I don't think that's necessarily a good thing, in the race to become the first in the country, I mean in the world, with artificial intelligence. And then we find out that Facebook had actually been giving China,
(01:10:01):
I don't want to say the technology, but somehow they got it, because Zuckerberg wanted to break into China so badly. He also married a Chinese American woman, so he's got a China thing. He wanted the president of China to name his firstborn, which the
(01:10:21):
president of China refused to do. So anyway, in order to get into China, what he did was, and there was a book written on this, a big hearing in Washington, and the author of that book was the whistleblower that blew the lid off this whole thing. And you know how we were rushing to
(01:10:43):
get the AI out there, and then all of a sudden China had it. This was like a couple of months ago. Remember the news where China came out with the artificial intelligence first? I think it was called DeepSeek or something. Yes, yes, okay. Well, that's because of Facebook.
(01:11:03):
So these corporations have no allegiance to the country that they started in. Like, these companies that started here in America, they have no loyalty at all.
Speaker 1 (01:11:22):
Well, you're saying that they are all for AI being developed in the US, is that right?
Speaker 14 (01:11:34):
Well, we were on the cusp of getting it out there, and China beat us to it. We heard this news not that long ago. It's amazing.
Speaker 3 (01:11:47):
China. China beat us for the commercial use of AI, because we had been using AI in the government for a while.
Speaker 1 (01:11:57):
Now.
Speaker 14 (01:11:58):
Well, of course, we have the military version of AI, but they...
Speaker 3 (01:12:07):
Still, the Internet, the Internet was a US military use that we then exported to the commercial world, AI as well. I mean, you're talking about giving the computer the ability to think for itself and make decisions.
(01:12:28):
So I'm sorry to interrupt, I mean, just for my own personal knowledge, is that what you're talking about? Like ChatGPT, those types of things?
Speaker 14 (01:12:38):
Well, yes, the Internet. The Internet is an example of artificial intelligence when it makes recommendations to you, when it picks certain elements that you're looking for; it will make that decision for you, it's not you picking. So again, we're being limited in the ability to make choices, to
(01:12:59):
make educated, informed choices, because we're being fed only certain kinds of information instead. And the problem with this is that the people who come from Silicon Valley, and yes, addressing your point before, yes, the US military did develop
(01:13:22):
the Internet and then applied it to commercial life so that we could all use it. And that's been the way with a lot of innovation and a lot of things, you know, a lot
(01:13:42):
of medical innovations, things like that. So yes, our tax dollars have paid for this research and now we get to enjoy it. But also the corporations are getting to enjoy the benefits, the fruits of the taxpayers' money as well. And the problem is the people in Silicon Valley. There's a couple of them that you really need to keep an eye on, because their philosophy is
(01:14:03):
quite dangerous. They do believe in a future of technocrats, and that future does not include democracy. People like Peter Thiel. You know who Peter Thiel is? Yes, okay. So he founded PayPal, right. Then he developed this other company,
(01:14:26):
which I think is a very dangerous company. It's called Palantir, and they have a lot of contracts with the military, but they also have contracts with other countries' militaries and police forces. And so what they're doing now is they are creating databases on each one of
(01:14:48):
us based on all the information that they can get: all our tax records, medical records, purchasing history, internet searches, comments on social media. They have you all wrapped up in one package. They know exactly who you are, and they can filter out this information and go after certain
(01:15:13):
people that they don't like. That's how they were... Whatever your views on immigration, I don't agree with people coming over the border illegally, and the system needs to be fixed. But at the same time, the reason why they were able to so quickly round up a lot of the protesters protesting what's going
(01:15:38):
on in Gaza is because this company, Palantir, put it together. It took information from social media, people's posts, their activities, their movements. They can even track you on your cell phone. They know exactly where you live. They can look at the GPS tracking on your phone and know exactly where
(01:16:02):
you go shopping, at a certain time when you leave the house. They know all of this stuff, and that's why it's dangerous.
Speaker 1 (01:16:11):
Christine, did you say Peter Thiel? Yeah.
Speaker 14 (01:16:15):
Peter Thiel, I need to say. You know, we all look up to these billionaires like they're someone.
Speaker 1 (01:16:22):
The reason I just wanted to say that is because, you guys remember, we had that J6er Adam Villa Reale on, and he worked for Peter, which is kind of very interesting, and a problem.
Speaker 2 (01:16:41):
Yeah, the problem is we're not applying the laws on the books. I mean, the reason treason and stuff like that exists, Benedict Arnold, was, you know, he helped, you know, back in the day, but he gave aid and comfort to the enemy. And that's why he's considered, when you talk about treason, you talk about Benedict Arnold. One of these billionaires that Christine is talking about
(01:17:01):
and everyone else, whether it's Zuckerberg or whoever else, providing aid and comfort to the enemy. You know, if the military has social media and social engineering, which they're probably doing, and AI and things of that nature, and you have a Zuckerberg or Bezos or whoever wants to be Chinese, or a lot of people in your government decide to
(01:17:23):
give that aid and comfort, that's treason. Treason is an easy word to say, and it's an easy thing to prosecute if you uncover the evidence. Why are these people, because they have money, not being charged with treason? You know, you just work the J6ers. All this J6 stuff, this auto pen stuff, the Trump assassination stuff,
(01:17:47):
the election stuff, all of that is interconnected with treason. It's activity with foreign countries, in my opinion. But unless it's investigated and attempted to be prosecuted, all these like Peter Thiel and everyone else, we're going to lose this country. You got Dominion, you got thirty million people
(01:18:07):
coming across the border, you have people locked up for four years, the J6ers, you got an auto pen, and we...
Speaker 1 (01:18:15):
We got... incidentally, did you see where the J6ers are suing Nancy Pelosi for three hundred and fifty million dollars?
Speaker 2 (01:18:26):
They need to go to four thirteen. That's how the machine breaks. Yeah, yeah, it's ridiculous. But I'm sorry, Christine.
Speaker 14 (01:18:37):
Just... even Peter Thiel, these technocrats, like I said, you need to read up on his dangerous philosophy. There's an interview you can even find on YouTube, and the interviewer asked him, basically, what
(01:18:58):
is the future for humanity with artificial intelligence? And his slow response, or lack thereof, was completely telling. He hesitated. He didn't know how to answer that question, because he doesn't have good plans for
(01:19:21):
us. And people like him that are running these organizations and these, you know, technology firms and data harvesting companies, I mean, they're also creating pollution. Elon Musk's Grok artificial intelligence plant in Memphis: first they went in
(01:19:45):
and they lied and they said they weren't going to have turbines or something like that, because his operation takes a lot of electricity and the local grid could not handle the energy demands of this plant. So Musk decided he's going to build his own power
(01:20:09):
plant, basically, but he still had to go through the process of getting the permits and everything like that. So they lied and said they were using these smaller turbines, like portable turbines that you use for a portable purpose, you know, they're not meant to be fixed. So when they went back to the plant, the turbines were fixed, and
(01:20:32):
they weren't small, they were fixed to the top of the building. And then they said, now they're running on methane gas, which emits a lot of pollution. So a lot of the people in the area are complaining about, you know, breathing problems, their breathing conditions and respiratory conditions getting worse, because these turbines are
(01:20:55):
running on a methane gas that produces a lot of toxic byproducts like formaldehyde, nitric oxide, things like that, and these are just being thrown into the air because there's no regulation anymore. This current administration just got rid of, I'm not kidding you when I tell you this, got
(01:21:18):
rid of all the regulations for corporations to pollute. So they're building these massive, massive plants to handle their artificial intelligence without going through the proper channels and making sure that what they're doing is not going to impact the environment, poison people, you know, make them sick. It's only like
(01:21:43):
twenty, thirty years after the fact that we find out there's a correlation between what they did thirty years ago and the fact that now we have these massive amounts of cancer in this particular area. And like I talked about on the last show, a lot of these corporations put their
(01:22:09):
most-polluting plants in the areas where low-income people live, because they just don't give a shit.
Speaker 1 (01:22:10):
Well yeah, and they don't pick up the trash either.
Speaker 2 (01:22:14):
Well, well, look at the whales and the windmills, you know. Yeah, you know, we talked about that years ago, and nobody wanted to talk about these whales just all of a sudden washing up along the shore. Then Fauci, they had you putting masks on your face. That's basic biology. You don't cover what you breathe out and take in. But
(01:22:35):
they had you cover your face for a damn year. Well.
Speaker 14 (01:22:40):
The thing is, they did this a lot. They did this in Asia for years, like way before COVID. I don't know, I have to look into it a little bit more deeply, but I would always look at pictures from China and Japan and they'd be wearing these masks. And this was way before, so
(01:23:01):
they had some laws in place in Asian countries, like if you have the flu, you have to wear a mask. I don't necessarily think it's a horrible thing. I mean, especially if somebody's sneezing all the time, do you want them sneezing on you, or would you rather they have a mask on, to tell the truth?
Speaker 1 (01:23:22):
Yeah. Okay, so I wanted to... I'm interested in your thoughts. I mean, everybody's started on the AI, because I recently read a thing, medically related, where you would interact with... you go to the doctor's appointment, and the first thing is you interact with
(01:23:45):
AI, and they kind of try to diagnose your problem, and then your appointment is ninety-five percent AI, and then you see a doctor for the last five percent of the time. I mean, what
(01:24:07):
are your thoughts on that? Are you...
Speaker 14 (01:24:10):
I don't think it's a good thing, honestly, because we are social creatures. We also have a thing called intuition, right? You use it, sometimes it works, right, and when we don't pay attention to it, we beat ourselves up, like, I should have listened to myself and done this. Well,
(01:24:33):
if you don't have a doctor there, another human being... Medicine is an art. It's not a thing without the human touch. It takes the human interaction. It takes being there and noticing things about the patient that are important. You
(01:24:55):
can read charts, you can have artificial intelligence read charts and take tests and measure things and come up with an analysis of the data, right? But it's missing human elements that it will always miss. And unfortunately, there's a whole
(01:25:20):
new industry developing for sex robots. When I saw that, I said, that's it. That's the end of humanity, because people will gravitate to these robots and not have human relationships anymore.
Speaker 1 (01:25:37):
Where do you get these sex robots?
Speaker 2 (01:25:42):
Man, years ago, from that movie.
Speaker 1 (01:25:45):
Does Tennessee have a law against sex robots?
Speaker 3 (01:25:49):
I imagine we probably do. It'd be a capital punishment case down here. So we actually just had a law on AI. I'm sorry, I haven't read it. That was one of the laws that we talked about earlier. They put a law on AI down here. I'm not sure what the governance of it is.
Speaker 14 (01:26:06):
That's a lot of states. A lot of the states have picked up the ball on creating legislation to protect their citizens' rights. Privacy is number one at the top of that list; surveilling
(01:26:27):
people without a warrant is part of that as well. You can't just mass-surveil people, you know, so...
Speaker 3 (01:26:37):
There are some cases coming with that. The question is whether you have a real expectation of privacy for information that's on the Internet, and then you have an AI-driven package, whether it's ChatGPT or one of the other ones, and you task it to data
(01:26:58):
mine the information and make targeting packages on individuals. Who would you sue, the person that's doing the tasking? I'm just bringing it up because it would be an interesting court case, especially if it's a commercial entity versus a government entity. There's
(01:27:19):
a lot of companies right now that are doing OSINT, open source intelligence. Yes, and these AI-driven tools are really good for the data mining piece of it.
Speaker 2 (01:27:33):
Well, how many of those... you know, every time you pretty much turn on your TV, you've got to update it, you've got to check these boxes, you've got to agree to their terms of use. You're right, Greg, it's going to be interesting. At what point are we submitting or surrendering that information when we go online? Is that now no longer a Fourth Amendment seizure
(01:27:55):
because we agreed to put it out there while we're home searching and things of that nature? With their data mining, are they stealing my information? Because we only wanted ourselves to know we were searching for that sex robot in Tennessee, because they make the best ones. And now everyone knows that.
Speaker 1 (01:28:12):
So we could go... yeah, so we could go and check. That's true.
Speaker 14 (01:28:19):
That will happen, that your purchasing, your search history... they have all that information, all your texts, all the pictures you've ever taken. I mean, it's a good thing in preventing crime. But as the society grows more
(01:28:40):
and more totalitarian, with an autocratic society, which is what technocrats want, yeah, then you're in trouble.
Speaker 1 (01:28:50):
Well, you've kind of described the World Economic Forum, right? I mean, that's what they're... I mean, so what you're saying is that potentially, at some point, you could, you know, inquire of the AI platform, you know,
(01:29:12):
your preferred AI platform, to tell you everything about Joe Jackass that lives in Nashville, and it would tell you everything: his Internet searches, where he lives, what kind of...
Speaker 14 (01:29:31):
You know, there are these companies that exist now, like, you know, the people-finding websites. Well, you can look up some of these backgrounds, you pay a fee, and yeah, they report, you know, where they live, you know their telephone number, you know what they're in for. You
(01:29:51):
can pay for this. But you can also get apps that scrub your information off the Internet, but it doesn't completely remove your information. And then also, if a government is involved, if a company like Palantir has got contracts with the US government,
(01:30:11):
of course they're going to get that information, all of it.
Speaker 3 (01:30:16):
I agree one hundred percent when you say that. The other thing is that it can actually translate for you. So you're looking for information not only in English, but you can get it in German, Chinese, Arabic, and it'll spit all that back at you in whatever language you want it in.
Speaker 1 (01:30:38):
Why do they want all of this? So we can all sit in fifteen-minute cities and, you know, be their slaves, basically, right? Well...
Speaker 2 (01:30:48):
Just pretty much the social engineering, and we talked about it. You know, it's eroding. We started with, okay, social media, so now we're all on the keyboard. We're no longer outside playing hopscotch, baseball, basketball, kickball, all that stuff. Now we're inside. Everything we're doing is inside. Then you go to the next phase, and in the next phase, now you've got a doctor. You don't even have to leave your house
(01:31:08):
to go to a doctor. Hopefully Medicaid covers it in the beautiful bill. But if you sit back, now you've got a doctor or a screen or some box-headed robot saying, well, Mr. Jacobs, the way your toe flips, you have whatever. And it keeps going on and on. And I don't know what to say about us being social creatures,
(01:31:32):
but I think, like you just said, John, they're starting to isolate us. You know, the technocrats are isolating us to our homes. We don't even catch the bus anymore, you know, and interact with anyone. You know.
Speaker 1 (01:31:47):
The big picture is, I mean, you know, I know probably everybody's astonished that there are one hundred and seventy-seven new laws in Tennessee. But I've got to tell you that no phone in the classroom is just the start of it. I mean, they should, you know, they should expand that. I mean, you know, speaking of, you
(01:32:12):
know, people, because there's no social activity, you know, anymore. You go to these college campuses, and the kids pass each other and they text each other. They don't wave, they just text each other.
Speaker 2 (01:32:26):
Now, Greg, oh boy. I call this movie "Idio-crazy," and I've gone out and brought it up a million times. You have to watch the movie Idiocracy. It actually shows that, you know, they're watering the crops with Tang. They show the dumbing down of people. There are so many
(01:32:46):
dumb people that the smart people... we're getting dumbed down. Everybody needs to watch "Idio-crazy," as I call it, even though the movie is actually called Idiocracy. It is.
Speaker 3 (01:32:57):
It's so it's fun.
Speaker 1 (01:33:00):
When I was in college, I was typing reports, or you know, papers, at two o'clock in the morning, at a minute to ten, you know, in the morning. You know, everybody was busting their ass, you know, to try to get it done. Now you just go to, what, your AI platform of preference and have it write
(01:33:22):
the thing for you. Is that what they're doing, I mean, basically?
Speaker 13 (01:33:27):
Their argument there, guys, is that they're freeing us up and increasing, you know, our quality of life.
Speaker 14 (01:33:42):
But what kind of quality of life is that if you don't have that human connection? You know, as humans, we are animals. We have certain, like I said, sensory perceptions, intuitions. When
(01:34:03):
you're looking at somebody's face, that's right, if you're a good reader of people, you can tell through just micro-expressions if somebody's lying to you. But you're not going to know that when somebody is texting you, you see, right?
Speaker 1 (01:34:21):
So Christine, can you, you know, answer this: how much time do we have left before the complete hostile takeover? I need to start eating better, getting outside more.
Speaker 14 (01:34:34):
I really don't know, but I think it's going to
come sooner than later.
Speaker 1 (01:34:39):
Great. Well, she's got another good point: you never see pickup games happening anymore. Younger generations can't handle a power outage.
Speaker 2 (01:34:49):
They can't.
Speaker 1 (01:34:51):
Look at Las Vegas yesterday, right? All those... did you see that? All the light poles were... you know, why was that?
Speaker 14 (01:35:01):
By the way, what happened?
Speaker 1 (01:35:06):
What... did you guys see the pictures from Las Vegas? All the electrical poles on one side of a long stretch were down, completely down.
Speaker 2 (01:35:22):
What happens when they hack through your power grid and things of that nature? We're in trouble. We just don't realize how much we've been infiltrated. And I think that we would have been in more trouble if the assassination attempt had succeeded, because like I said,
(01:35:42):
you've got the auto pen, you've got all this stuff going on. You know, you've just got all this information that China is into our systems. All this stuff is interacting, Zuckerberg, like Christine said, sneaking with...
Speaker 1 (01:35:56):
So a lot of people don't like this: go to a doctor's office and talk to an AI screen. You know, you have telemed, telemedicine now, where you call somebody and, you know, they do FaceTime, and, you know, it's going to get to the point where you don't know whether you're talking to
(01:36:18):
a person or not, right?
Speaker 14 (01:36:20):
That's the other thing that I wanted to address. I mean, I do feel like there should be a disclaimer on any kind of image or video that's put out there that was made with AI, strictly, well, this is...
Speaker 1 (01:36:39):
This is the... I mean, you guys probably know this, but on TikTok, if you generate an AI image and you just want to save it, there's a disclaimer at the bottom: AI image. But if you actually post it there, there's no disclaimer.
Speaker 2 (01:37:04):
Right. Well, look at what Greg said and what Christine said. Let's say you're having this conversation over the laptop about your health. Now everyone else has it, because it's being communicated. You know, did your medical history just get seized?
Speaker 1 (01:37:25):
Yeah. What if you asked ChatGPT, tell me about Joe Jackass's medical history, right? That was the question. Well, let's say you ask, eventually, if you don't have serious protections, you can get, you know,
(01:37:50):
the AI, you know, of your choice and ask, Joe Jackass in Nashville, what are his medical records? Right?
Speaker 2 (01:38:01):
Yeah, because this AI is communicating with that AI, and they're saying, yeah, here's everything that's wrong with him. Yeah, he's mentally unstable. I get that anyway, without AI.
Speaker 1 (01:38:12):
But I mean, that's scary. That's exceptionally scary. If you can eventually put in there, you know, tell me everything there is to know about Joe Jackass, that could be a major problem.
Speaker 3 (01:38:33):
It's less smelly than trash picking.
Speaker 2 (01:38:37):
Christine, you always talk about the food. How does all this relate to the processed foods and things of that nature, and the drugs? Everything now, you know, the Adderall, the Ritalin, the processed foods. Now you've got this. How... do you think all this stuff is interconnected?
Speaker 1 (01:38:58):
Yeah?
Speaker 14 (01:39:00):
I do believe that, because when you have bad nutrition or poor nutrition, you can't be healthy, you can't think straight, you know. So it will boil down to, instead of making the world a better place, like, for example, why aren't we just using natural means to
(01:39:21):
fertilize the crops, so that we can build the nutrition back up in the soil and produce healthy food that we don't have to pay extra for? Everything revolves around money, and whoever has money controls everything. But we have our numbers. So what we have to do is turn around and
(01:39:42):
avoid things that we don't believe in anymore, things that we don't support. So if you know you're worried about certain things, then you have to do a little bit every day just to help. If all of us just did a little bit, all together... alone we're just drops of water,
(01:40:03):
but all together we're a raging river. We all have to get together and do just a little bit, even if we just do a little bit. Like I said, if you have a house, one way that you can help the planet is grow a butterfly garden, grow a pollinator garden. This way we can bring back the bees.
(01:40:25):
We need the bees and the butterflies to pollinate our food so that we don't starve. Together we're ruining... humans are ruining everything on this planet. And that's where you do have to have certain regulations, because the people who are in control, if you were to tell them, hey, you can do whatever the hell you want, you can dump your
(01:40:46):
poisoned industrial waste in the river, kill all the fish, no problem. It's all about, you know, let's make money. That's what it's about. So each of us has to step up, and if you can do just one thing a day, even if it's minimal, that's helping, but everybody has to
(01:41:10):
chip in. They have to do something. So even if it's shunning certain things, even if it's growing a certain garden, make your own, grow your own food, have a food forest for your own fruits and nuts. Then you're self-sufficient. You don't have to worry about, you know, the rest.
Speaker 1 (01:41:31):
Well, in that regard, I mean, there are some people that are obsessed with Whole Foods, right? Jake, do you have a Whole Foods at your undisclosed location? Yes, you do. And Greg, do you have a Whole...
Speaker 2 (01:41:49):
They're called farms, John. Farms.
Speaker 1 (01:41:53):
Okay, but I mean, you know, I'm talking about the storefront, the Whole Foods. So everybody is under the impression that they're all organic and all that shit, and they're not. And somebody reported yesterday that the vegetables that they have, or the fruits and vegetables that they
(01:42:13):
have, reportedly are all sprayed with this Apeel stuff from Bill Gates.
Speaker 2 (01:42:22):
So did you see that movie Titan yet? Titan, with the submersible?
Speaker 1 (01:42:31):
Yeah?
Speaker 2 (01:42:32):
Well, did you see the guy who ended up getting killed, the billionaire? When he was trying to go around all the regulations, he made the statement, well, I'll just buy the politician. And that's what I believe, that's what Christine said. I don't want to put words in her mouth, but they're buying their way around all these regulations, the regulations are being usurped, in my opinion. So it takes the masses to overtake...
Speaker 1 (01:42:55):
Them. Exactly.
Speaker 14 (01:42:59):
There's more power in numbers than in just one individual alone. So we all have to get together instead of fighting with each other. And you know, what a lot of social media has done, by design, is polarize us. So we've got one camp on this side and one camp on that side. But I'm sure we'd see that we have a
(01:43:21):
lot in common, you know. So really, instead of looking at each other, we've got to look up, up at who's in charge, who's doing this to us.
Speaker 2 (01:43:33):
You know.
Speaker 1 (01:43:34):
So, as to Lindy Lou's question: where do I start if I'm a beginner wanting to start growing my own stuff?
Speaker 14 (01:43:46):
You can just start with a container, container plants. You buy a bucket or get a used bucket, you know, and then you plant the plant.
Speaker 1 (01:44:01):
Didn't you tell us... didn't you tell us before that there was a type of, you know, a label that we're looking for, you know?
Speaker 14 (01:44:18):
The non-GMO labels found on food. But honestly, we can't trust that anymore, because there's so much, you know... anytime anybody pays anything to put, you know, their name on some label or something, there's got to be corruption. I mean, how much food is really being tested? So
(01:44:40):
I try to buy organic. But the best solution, if you can, not everybody has time, but it's worth the time, is to grow your own food. This way, you know you're not dumping chemicals on it, you're keeping it organic and not spraying it. You know where your food's coming from, and that's that.
Speaker 1 (01:45:02):
Can't beat it, right? So let me ask you this question. This is kind of a stretch. So we've got contaminated water. I mean, I've seen a map, I've seen a lot of maps of contaminated water, you know, municipal water that quite frankly results in a variety of diseases, right,
(01:45:25):
because there are PFAS, forever chemicals, or something in the water. So what happens if you take tap water from a municipal source that presumably has these forever chemicals in it, and you use that to water
(01:45:45):
your garden? Is that going to screw up, you know, what you're growing?
Speaker 14 (01:45:53):
I mean, unfortunately, you're going to have to do whatever you need to do to keep your garden going, so if that's using tap water from time to time... But obviously there are methods that you can use for how you water the garden. There are certain methods of planting, like, for example,
(01:46:15):
ones that hold the moisture that you have in the soil to begin with. So that involves organic matter, some decomposed wood chips mixed in with the soil, and then you mulch it. You put grass clippings on top, as you put cardboard on the bottom. That keeps
(01:46:35):
the soil moist, so you don't have to water as much when you do need to water. That's the preferred method. I know it's a little complicated, but I would spend the first year setting up my garden, building up the soil, building up your compost pile. Just take a year to
(01:46:57):
get it prepared, the soil and everything, and then when you're ready, then you start planting, and you can lay down... the most effective is drip irrigation. You lay these lines and then you cover them, cover them up, either with mulch or with material, cloth, and then you have
(01:47:23):
a timer, so you put the water on at night where it's watering your roots. That's the best method to prevent disease, to keep the water low with drip irrigation. There are things like squash and cucumbers that get, you know, mildew disease from having wet leaves. You want to do this, prepare everything first.
(01:47:48):
This way you're not wasting water. You're not watering at noon, when it's the hottest part of the day and it evaporates, so you're going to either do it early in the morning or late, you know, like six o'clock at night when the sun goes down, and have it on a timer. So you're not
(01:48:09):
wasting a lot of water. But rainwater is the best. You know, I can't always get all the rainwater, but people do use rain barrels. They save up the water, yeah, off the roof, and...
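For listeners who like to tinker, here is a minimal sketch of the watering-timer idea Christine describes, watering early in the morning or after sundown and skipping the hot midday hours; the specific start times and run lengths below are hypothetical placeholders, not anything recommended on the show.

```python
from datetime import time

# A minimal sketch of the drip-irrigation timer idea described above:
# water early in the morning or after sundown, skip the hot midday hours.
# The start times and durations below are hypothetical placeholders.
WATERING_WINDOWS = [
    (time(5, 30), 20),   # early morning start, run for 20 minutes
    (time(18, 0), 15),   # around sundown, run for 15 minutes
]

def should_water(now: time) -> bool:
    """Return True if 'now' falls inside one of the watering windows."""
    now_minutes = now.hour * 60 + now.minute
    for start, run_minutes in WATERING_WINDOWS:
        start_minutes = start.hour * 60 + start.minute
        if start_minutes <= now_minutes < start_minutes + run_minutes:
            return True
    return False

if __name__ == "__main__":
    # Probe a few times of day: the midday check should come back "skip".
    for probe in (time(5, 45), time(12, 0), time(18, 10)):
        print(probe, "water" if should_water(probe) else "skip")
```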
Speaker 1 (01:48:22):
And they know that doesn't necessarily have the forever chemicals in it, right. So hey, look, we've got about a minute left, but I wanted to mention that you can read what Christine has written at New Living News dot substack dot com, and you've got one coming out
(01:48:44):
soon relative to AI.
Speaker 14 (01:48:45):
Right. So I'll put something up in another week about what we discussed, with a little bit more detail. But I mean, yes, AI can help. It's helping in the sense that it can take your information, all your blood work or your scans, your CTs, your X-
(01:49:10):
rays, and turn it into a comprehensive analysis of your health, including genetic information. And it can give you kind of a report of, you know, what we need to do in order to prevent this disease, or a recommended drug. AI is also helping speed
(01:49:33):
up drug development. So there are, you know, pluses and minuses. Again, I think we're getting too far ahead of ourselves without slowing down and trying...
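As a toy illustration of the "read the blood work and flag what is out of range" report Christine describes, here is a minimal sketch; the lab names, values, and reference ranges are illustrative placeholders, and this is not how any particular clinical AI product works.

```python
# Toy example: flag lab values that fall outside a reference range.
# Reference ranges and patient values below are illustrative placeholders only.
REFERENCE_RANGES = {
    "glucose_mg_dl": (70, 99),
    "ldl_mg_dl": (0, 100),
    "vitamin_d_ng_ml": (30, 100),
}

patient_labs = {
    "glucose_mg_dl": 112,
    "ldl_mg_dl": 88,
    "vitamin_d_ng_ml": 21,
}

def build_report(labs: dict) -> list:
    """Return one line per lab value, noting whether it is within range."""
    lines = []
    for name, value in labs.items():
        low, high = REFERENCE_RANGES[name]
        status = "in range" if low <= value <= high else "OUT OF RANGE"
        lines.append(f"{name}: {value} ({status}, reference {low}-{high})")
    return lines

if __name__ == "__main__":
    for line in build_report(patient_labs):
        print(line)
```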
Speaker 1 (01:49:48):
Okay, well, we've got to wrap. And I just want to tell everybody not to run any of us through AI, okay? Anyway, we'll know about it, so don't do that. But read Christine's Substack, you know, because you'll be coming
(01:50:11):
out with a new one.
Speaker 14 (01:50:12):
Right. So I'll be more regular with my writing. So, okay.
Speaker 1 (01:50:19):
Well, you've been busy, right? So, okay. Well, happy Fourth of July to everybody. So I appreciate you guys taking the time. Christine, we appreciate you taking the time to...
Speaker 2 (01:50:31):
Be with us.
Speaker 1 (01:50:32):
We'd love to have you back, because we've got a lot more. I've got a stack of questions here about all kinds of shit, so, you know, I'd like to... we'd like to pick your brain about that. But Jake, thank you very much. Greg, thank you very much. Christine, thank you very much. And have a great Fourth of July weekend. Right, okay, everyone. Yeah, take care, guys.