
September 4, 2024 · 25 mins
Former aide to New York governors charged with acting as an agent of the Chinese government. California is racing to combat deepfakes ahead of the election. The new recruitment challenge: filtering out AI-crafted resumes by adding screening steps. Wally Amos launched and lost a cookie empire. His family reveals the trailblazer's secrets.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to

Speaker 2 (00:01):
KFI AM six forty, the Bill Handel Show on demand
on the iHeartRadio app. All right, this is KFI
AM six forty, Bill Handel here, and hopefully I won't
cough through this segment and.

Speaker 1 (00:17):
Fighting a cold, and it is, it's not fun.

Speaker 2 (00:21):
And Cono, you know what, when I'm coughing, you know,
cut off the end of my segment when I cough, please.

Speaker 1 (00:30):
I don't know. Sometimes when you're about to cough though,
it's very quick. Yeah, it's a quick cough. That one
did come out of the blue. Well it did. It
hit me too.

Speaker 2 (00:40):
Usually I cough and grunt and belch and I don't
have a problem doing that on the air.

Speaker 1 (00:45):
I mean I couldn't care less, but that.

Speaker 2 (00:48):
One was just, you know, hawking, and you know, it's
just, it's horrific. Okay, now, and welcome everybody to KFI
AM six forty on a Wednesday.

Speaker 1 (01:04):
What is going on today?

Speaker 2 (01:06):
First of all, it is going to be hotter than
hell today. I mean, Woodland Hills tomorrow is gonna be
one hundred and thirteen degrees. Hey, welcome to climate change.
A story that broke yesterday, which was kind of interesting:

Speaker 1 (01:16):
Former aide. And this is the interesting part. Former aide
to current New

Speaker 2 (01:20):
York Governor Kathy Hochul and former Governor Andrew Cuomo.
She and her husband were charged with acting as agents
for the Chinese government. Now, this is not stealing secrets,
this is not grabbing classified information. What this is
about is simply helping the Chinese government and doing things

(01:45):
as a government employee, right, for which she is now
considered a foreign agent in violation of the
Foreign Agents Registration Act.

Speaker 1 (01:57):
And she's accused of visa.

Speaker 2 (01:58):
Fraud, alien smuggling, money laundering. And why is this? Well,
because on behalf of the Chinese government and on behalf
of the Chinese Communist Party, which by the way is
one and the same, what she did is get millions
and millions of dollars. She and her husband, boy, they

(02:23):
had it. They had it rich. The Chinese government really
wants to have themselves portrayed as good people and will
do almost anything.

Speaker 1 (02:34):
And she helped in the matter.

Speaker 2 (02:36):
I mean, she made sure that the Chinese government was
in the best light in New York, that Chinese officials
were coming in meeting with people that they otherwise would
not have been able to meet, lied about these Chinese
officials' positions, and you can't do that.

Speaker 1 (02:56):
You have to register yourself.

Speaker 2 (02:58):
If you are helping a foreign country, you have to
register yourself as a lobbyist, as an agent of that
foreign country.

Speaker 1 (03:05):
And not to do that is a big, big deal.

Speaker 2 (03:10):
So some of the stuff she got and her husband got,
and I love this, and it's, I mean, it really
is impressive: they bought real estate in New York and Hawaii.
They bought luxury vehicles, including a Ferrari. And the big one,

(03:30):
This is going to give them plenty of jail time,
is they arranged for the delivery of Nanjing-style salted
ducks to her parents living in the United States.
Now, it is worth noting, I found out. Of course, I
looked that one up. The rest of it, I couldn't

(03:51):
care less. Nanjing-style salted ducks are ducks that are
brined and then cooked slowly over time, and they're very
tender and they have the flavor and it falls off
the bone.

Speaker 1 (04:07):
So what I'd like to do is I've sort of
done this story.

Speaker 2 (04:11):
Their lawyers have said, of course, none of this is
true or it's exaggerated. They do have different lawyers, of course,
husband and wife. They didn't do any of this, we
can't wait to go to court and we'll be vindicated,
the normal defense lawyer claptrap. And I don't know if
you've ever noticed that, but when lawyers say that we're

(04:33):
going to be vindicated in court, first of all, almost
never are they vindicated in court. And all of a sudden,
the lawyers tend to disappear when there is a conviction.
And I wish that reporters would go, hey, how'd that
work out for?

Speaker 1 (04:50):
Uh huh, vindication? What do you think?

Speaker 2 (04:53):
So what I want to do is spend the rest
of the segment talking about Nanjing-style salted ducks, because
I think this is by far the most important part
of this story. I've never had Nanjing-style salted ducks. Now,
I've had Peking duck, which is, I mean, those

(05:13):
are crispy ducks.

Speaker 1 (05:15):
And those are dried in cold air.

Speaker 2 (05:18):
That's blown across, and they're usually served in these little buns.
They're like duck McMuffins. But have any of you had
salted Nanjing-style ducks?

Speaker 1 (05:30):
Guys? No, huh no, Okay, how much are they? I
don't know.

Speaker 2 (05:37):
I have to look that one up too. I have
to go, Panda Express doesn't have them, I know that.
In any case, she and her husband, undisclosed agents
of the Chinese government.

Speaker 1 (05:49):
And this is jail time. This is no joke.

Speaker 2 (05:52):
Menendez got nailed on that one, where he was, Senator
Menendez, an undisclosed agent for Egypt, and that is
going to cost him.

Speaker 1 (06:04):
Some big, big problems. So we have a story, and
the big story here is that this woman, Linda Sun,
made it up

Speaker 2 (06:17):
The ladder to the point where she became chief of
staff in Hochul's office and was a big, big player
in Cuomo's office, and then the whole time she was
an agent for the Chinese. I guess the only good
news is that, as I said earlier at the
beginning of the monologue, you didn't have secrets that were stolen.

(06:40):
This was not sabotage. It was a little sort of espionage,
but not even that. It was simply doing PR for
the Chinese government, putting them in a positive light and
eliminating or degrading Taiwan, uh, in New York politics. All right,
now I want to talk about deepfakes right ahead

(07:03):
of the election. They're out. Right after Kamala Harris
launched her presidential bid, there was a video that came
out, and it was created with AI and it went viral.
She says, or someone that sounds much like her, and
you see her, I am your Democrat candidate for president

(07:23):
because Joe Biden finally exposed his senility.

Speaker 1 (07:26):
At the debate.

Speaker 2 (07:27):
I was selected because I am the ultimate diversity hire.
Now, deepfakes are going on on both sides, and it's
just now a tool that's being used, and it is, well,
you don't get the reality anymore. Now everything is so
convoluted. Information, when we talk about disinformation that comes out,

(07:49):
misinformation that we see on the internet, and now the
majority of people get their news from the internet, the
crazy stuff that comes out, I mean, what do you know?
You see Kamala Harris saying that; it looks like her,
it sounds like her. So then we have that
same issue with commercial ads, Tom Hanks promoting a product.

(08:15):
You have other actors doing the same thing and they
come out and say, no, that wasn't me. It's a
deepfake. Now the issue is, when is this parody?
Because under the law, you can't sue someone for parody.
And how do you define parody? Well, parody is defined

(08:39):
effectively as something so over the top, I'm paraphrasing now,
so over the top that no one believes it, and
therefore it really doesn't push its message. And I'll tell you
one of the most famous cases, and I love this,
and that was Jerry Falwell and a proposed ad

(09:03):
for one of the liquor companies. There was a cartoon
in the ad, and this, oh God, I love
this story. He talks about his first sexual experience being
with his mother in an outhouse.

Speaker 1 (09:23):
And Jerry Falwell sued Hustler.

Speaker 2 (09:28):
And here's what the court, what the judge had even asked,
and that is during the course of all this, and
the judge sometimes is allowed to ask plaintiffs questions, and
in this case it happened. Do you think anybody believes this?
Do you think anybody out there believes that you've had

(09:48):
sex with your mother in an outhouse? By the way,
the cartoon shows him leaving the outhouse and he said, no, no,
no one believes it.

Speaker 1 (09:59):
And is it fair to say, then, this was parody?

Speaker 2 (10:02):
Well, I guess so. The case was dismissed because it's
so over the top. Kamala Harris saying I'm your Democrat
candidate for president because Joe Biden finally exposed his senility
at the debate, is that parody? Do people believe that? Well,
unfortunately they do. It's been a lot of years since

(10:25):
the Jerry Falwell case. And Elon Musk, who has a
tremendous amount of influence, who's endorsed former President Trump, shared
that video on X, and two days later he said
it was meant as a parody. His initial post,
sharing that Kamala Harris statement, and there she is
saying that she's the ultimate diversity hire, his initial post
had one hundred and thirty six million views. The follow
up post saying it is a parody had twenty six
million views, one hundred and ten million more views of
what he claims is the parody,

(11:12):
that parody post.

Speaker 1 (11:14):
And that is a huge problem.

Speaker 2 (11:17):
And we don't know what's going to go on, and
sometimes I don't think even the candidates have any control
over this.

Speaker 1 (11:23):
This is people.

Speaker 2 (11:25):
Who are just putting it up, because you can put
anything up there, and with AI, boy, it really gets
a little sketchy, doesn't it? So in California, there's a
bill up there that bars people from distributing deceptive audio.

Speaker 1 (11:40):
But here's the problem.

Speaker 2 (11:43):
You can ask and you have to go to court
to ask the platforms to take down what is either
parody or unfortunately a statement or a video that people believe.
So you have to go to court and you can
sue for damages. But what happens when it's released three

(12:04):
days before the election? What cost Hillary her election? Eleven
days before, the FBI chief said, yeah, we're investigating what
was on Hillary's server, in view of the Biden investigation,
in view of the Hunter Biden investigation. That put it

(12:26):
over the top.

Speaker 1 (12:30):
And eleven days before it put it over the top.

Speaker 2 (12:32):
Can you imagine one of these crazy posts, AI-generated,
quote, parody, making insane accusations or insane statements that a
candidate has.

Speaker 1 (12:49):
Do I think it's going to happen?

Speaker 2 (12:50):
Oh?

Speaker 1 (12:50):
Yes, I do.

Speaker 2 (12:52):
It's gonna go crazy and it'll be days before the
election and we'll see what happens with that.

Speaker 1 (12:58):
And I'm guessing the.

Speaker 2 (13:01):
Republicans are going to do more of it, because the
attacks on Kamala Harris are very personal. The attacks on
Donald Trump are less personal, not that they're not personal,
because they are. But you know, Kamala Harris being a
fascist and a communist in the same speech, how do

(13:23):
you be a fascist.

Speaker 1 (13:24):
And a communist? Those accusations within five minutes of each other.

Speaker 2 (13:29):
Now Kamala Harris is talking about him being unfit to
be president because of what he says and what he
has done. I think there's more credibility to that accusation.

Speaker 1 (13:41):
But then again, I'm biased. You know, you have to
take that with a grain of salt.

Speaker 2 (13:44):
You know where I sit with this election coming up,
and strangely enough, I'm going to do much better under
Donald Trump if he becomes president. I'm going to get
taxed out of my mind if Kamala Harris becomes the president.
Matter of fact, I'll probably invite you to dinner in
my dumpster.

Speaker 1 (14:06):
It's going to be rough.

Speaker 2 (14:08):
But you know, I hold the presidency, and
this presidency is sacrosanct for me.

Speaker 1 (14:18):
Okay, enough of that.

Speaker 2 (14:19):
We go swimmingly into the political life and we're going
to say, you think I'm gonna talk.

Speaker 1 (14:23):
More about this, you bet? Okay?

Speaker 2 (14:26):
AI. I've talked about AI so many times, and we
don't even know where AI is going. This is
just the very tip of where it's going. One of
the places where AI is being used big time is
when people apply for jobs and when companies hire, because

(14:48):
with AI, you just sound really good. And whenever a
company is looking, let's say there's a LinkedIn post and
they're looking for whoever, in terms of, these are the
skills I want, AI immediately puts your resume together and
all of a sudden, you have the experience and you
are considered now as a potential finalist. If you don't

(15:12):
know how to write a resume with exactly what these companies want,
you're bounced.

Speaker 1 (15:16):
Even though you're.

Speaker 2 (15:17):
Qualified, you have the skill set, you want the job,
you would fit perfectly, and the problem is that you're
not even going to be considered, because AI
also looks at all the job openings that are out
there that possibly you can maybe, kind of, sort of,

(15:42):
that your uncle who works in a deli, one of the
deli workers, has told you about, and that is, oh boy,
let's apply for that job.

Speaker 1 (15:53):
Right. Do you have the skill set? Well?

Speaker 2 (15:57):
AI provides a resume which makes you have the skill set.

Speaker 1 (16:03):
It just does a great job to make you look good.

Speaker 2 (16:06):
The problem is it does the same for the hundreds and
hundreds of other applicants looking for a good job, looking
for that job. So how does a company looking at
these applications filter out what's going on? Filter out the
applicants who don't have the skill set, who aren't going

(16:28):
to fit.

Speaker 1 (16:29):
Who clearly are using AI.

Speaker 2 (16:31):
Because there are programs that now figure out whether AI
is going to be used or is being used.

Speaker 1 (16:39):
And so let's get this kind of straight here.

Speaker 2 (16:43):
You use AI in applying for a job to make
yourself look great or to give you the credentials where
you would be considered. The companies are using AI to
figure out whether you're using AI. One of the things

(17:04):
that we used a form of AI when I was
hiring, when I had the surrogacy agency, and we wanted
certain words used in the application, and we would only
consider those, and it was sort of, well, it wasn't AI.
But what we did is simply look at words. Did

(17:24):
they use the appropriate words in describing their skills? And
if we didn't find those words in the resume, we tossed it.
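
A minimal sketch of that kind of keyword screen, for illustration only; the keyword list and the match threshold below are invented for the example, not the agency's actual criteria:

# Keyword-screen sketch: keep only resumes that mention enough of
# the required terms. REQUIRED_KEYWORDS and MIN_MATCHES are
# illustrative placeholders, not the agency's real criteria.
REQUIRED_KEYWORDS = {"surrogacy", "intake", "case management", "counseling"}
MIN_MATCHES = 2

def passes_screen(resume_text: str) -> bool:
    """Return True if the resume mentions at least MIN_MATCHES keywords."""
    text = resume_text.lower()
    return sum(kw in text for kw in REQUIRED_KEYWORDS) >= MIN_MATCHES

# Toss any resume that doesn't clear the bar.
resumes = {
    "applicant_a": "Five years of surrogacy case management and client intake.",
    "applicant_b": "Managed a deli counter and handled customer orders.",
}
kept = [name for name, body in resumes.items() if passes_screen(body)]
print(kept)  # ['applicant_a']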
And I think my partner at that time did. I'm
trying to remember whether there was a program or not
that did that, and it hasn't been
too many years, so I think that there was. She

(17:45):
did all the hiring, so I never paid attention. As
a matter of fact, I never knew anybody's name. And
are you surprised that I didn't know anybody's name that
we hired? There were people that were working for me
for literally ten years, and I would walk around saying,
what's your name? I do it at the station too. Amy?

(18:06):
How long was it before, and I was introduced
to you several times, that I called you Amy?

Speaker 1 (18:12):
You mean how long did you call me?

Speaker 2 (18:14):
Am?

Speaker 1 (18:15):
Yeah? Eight months? Okay, that's great, it's.

Speaker 2 (18:20):
Good for you.

Speaker 1 (18:20):
And Neil was about six months, yeah. And Cono, how
long was it before I knew your name? Yeah? Like
a month or two? Wow. You know what?

Speaker 2 (18:30):
That is fantastic because people work for me for years.

Speaker 1 (18:35):
I'm a memorable guy.

Speaker 2 (18:37):
Valentine, I was, and I'm not exaggerating, I've run
into Valentine a few times. Who is the
DJ in the morning? What's the station he works at?
Because I've been told. Yeah, it's, I don't even
know the stations we have here on the cluster, even
though I've been told over and over and over again.

Speaker 1 (18:58):
So I would run into Valentine. I don't know how
many times. Probably I go who are you?

Speaker 2 (19:07):
And he said, I'm Valentine. And the next time I
would see him, I go, now.

Speaker 1 (19:11):
Who are you? You look a little familiar.

Speaker 2 (19:13):
I'm Valentine. That probably happened, and I'm not exaggerating, you
can ask Val, probably fifteen times before I finally remembered
his name.

Speaker 1 (19:26):
I am not kidding.

Speaker 2 (19:27):
As a matter of fact, whatever award I got, and
Val did a video, which, you know, people do when
they can't show up at the awards.
And what he would do is he went through a
bunch of storyboards. He went through some posters and he
started with, my name is Valentine. And he put that

(19:48):
down and he said, Bill, my name is Valentine.

Speaker 1 (19:52):
I just forgot.

Speaker 2 (19:55):
And so people have worked for me for years, and
I don't know why I went to that story.
I digress; it has something to do with AI being
used for applications.

Speaker 1 (20:06):
It's the AI fight. You know, my dad can beat
up your dad.

Speaker 2 (20:10):
And it's a real problem because you have people that
have the skill set, who are a fit for a
job and the company is being inundated with hundreds of
applications for every job, and what do they do? Well,
in some cases, the solution is even more tech. BrightHire

(20:32):
is a company, they're a software company, and they record interviews.
They actually have interviews, and then those interviews are looked at.
Now even though they figure out who it is, they
get to maybe the top ten or fifteen through AI,
at least the finalist is in front of a camera
talking to someone. There is one recruiter where he talks

(20:58):
about several clients who have told him that they have,
they have the skill. Anyway, when you're looking at someone who
can ask you questions, well, I don't know. Let's say
a question is being asked and you are looking
off to the side, at the ChatGPT answer. I mean,

(21:20):
you can try to do that. Your eyes go off
to the side, or you have a headset on or
one of the little earbuds, and it takes a recruiter
or it takes an employer not very long to figure
out you don't know what the hell you're doing. Another
way they do it, but keep in mind this is
old school because.

Speaker 1 (21:36):
They're talking to people.

Speaker 2 (21:40):
Is how much skill do you have in coding, for example,
on your resume, because the AI program that you're using
figures out this is what the company wants and portrays you
as the fit.

Speaker 1 (21:59):
And so the interviewer says, okay, let's see what you
can do. Go ahead. Now, maybe you go.

Speaker 2 (22:06):
Ahead and use ChatGPT. But the interviewer says, go ahead,
how'd you get there? What were the questions you asked?
And all of a sudden you can figure out, or
they can figure out, this isn't kosher, because they'll
actually show, or ask candidates to show them, what the
skill set is. Also, just looking at resumes, when they

(22:31):
claim they have.

Speaker 1 (22:32):
A knowledge of AI, for example.

Speaker 2 (22:35):
And claim they've worked at companies that frankly weren't even
using AI at the time they worked for them. Okay,
once again, the employer is using AI to determine that
the AI that's being used by the prospective employee is well,

(22:57):
let's just say it's exaggerated instead of outright lies.

Speaker 1 (23:01):
Now I don't know.

Speaker 2 (23:02):
Well, if you're looking for a job, your resume, who
doesn't lie on a resume, or at least expand on
a resume, make yourself look far greater, far more important?
There is no such thing as a janitor anymore. It
is a much more important-sounding job. There's no question

(23:27):
you're a maintenance engineer. If you are a, well, let's
say you're working behind the counter, right? You're not actually
just behind a counter. You are a customer
specialist dealing with clients.

Speaker 1 (23:46):
Well, can AI figure that out? Yeah? Which AI is better?
Does AI figure out that, for.

Speaker 2 (23:56):
Example, you really were a customer specialist, that there was
training involved, that there is that level of employee at
the company. All right, So what's real and what isn't?
And they're just starting to figure this out, they really are.

Speaker 1 (24:13):
Just, it's, it's tough. I don't care about resumes.

Speaker 2 (24:17):
You don't have to. Well, here is the point: start
your own business and then you don't care. You know,
we talked about my phenomenal law degree that I got,
and I came out, took the bar, passed it, and
you know why I started my own practice? I couldn't
get a job in a real law firm. So I've

(24:42):
been a lawyer for a long time and I've never
worked for a company. And I was a complete drug
addict my first four years. That's my resume. It's pretty impressive.
I have to tell you, drug addict lawyer, I'd hire me.

(25:04):
I would. All right, you've been listening to The Bill
Handel Show. Catch my show Monday through Friday, six am
to nine am, and anytime on demand on the iHeartRadio
app.
