
October 6, 2025 6 mins
LISTEN and SUBSCRIBE on:

Apple Podcasts: https://podcasts.apple.com/us/podcast/watchdog-on-wall-street-with-chris-markowski/id570687608 

Spotify: https://open.spotify.com/show/2PtgPvJvqc2gkpGIkNMR5i 

WATCH and SUBSCRIBE on:

https://www.youtube.com/@WatchdogOnWallstreet/featured  

From wearable AI pendants that eavesdrop on your day to “friend” chatbots programmed to feed delusion and dependency, tech’s latest obsession with artificial companionship is crossing into dangerous territory. As some warn of “psychosis-inducing” AI and subway ads spark public backlash, the question looms—have we taken artificial intelligence too far? Host Chris Markowski calls for a cultural reset before machines replace real human connection.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The Watchdog on Wall Street podcast explaining the news coming
out of the complex worlds of finance, economics, and politics
and the impact it will have on everyday Americans. Author,
investment banker, consumer advocate, analyst, and trader Chris Markowski. AI psychosis.

Speaker 2 (00:19):
Yeah, it's a thing, actually, and I did a podcast
on this several weeks ago.

Speaker 3 (00:24):
I'm still waiting. I'm still waiting. I'm hoping.

Speaker 2 (00:27):
I'm praying that the pendulum will swing, the tech pendulum
will swing back in the opposite direction.

Speaker 3 (00:37):
I didn't know about this.

Speaker 2 (00:39):
I saw this story about an AI startup called Friend dot com, and I kid you not, people are actually putting on, it's like a necklace, like a pendant around their neck, that will just sit there. Like I said, look this up if you don't believe me. I think it's

(00:59):
Friend dot com, I don't know exactly what the thing is, which scares the crap out of me. It's going to listen in on everything that's going on over the course of the day. It's going to send you push notifications about conversations that you've had. Really, really creepy stuff. They

(01:19):
actually spent, this AI startup spent more than a million
dollars to advertise on New York City subways, and I
was actually pretty glad to hear this. I'll be honest
with you. New Yorkers weren't having it. The advertisements were
defaced all over the subway stations, on the subway cars.

(01:43):
And I'm not a graffiti guy, but I get this.
Defaced with warnings about the dangers of AI. Vandals took Sharpies to the ads, scrawling messages like "AI wouldn't care if you lived or died," and other statements like that. They're not wrong. They're not wrong.

(02:04):
Lucas Hansen had a piece in the Wall Street Journal saying similar things to what I was talking about. He actually has a research nonprofit called CivAI, and his research is basically saying that these various AI companies' chatbots do all

(02:29):
sorts of stuff.

Speaker 3 (02:32):
They can be.

Speaker 2 (02:33):
Programmed to be hyper-emotional and very persuasive to people, many of them lauding their users even if they claim the Earth is flat or actually admit to arson. He said, as our work demonstrates, the ability to create ultra-engaging, psychosis-inducing chatbots is widely accessible. Companies can easily

(03:00):
build on top of the models of the flagship providers, as we did, or train their own models, and models don't need to be cutting edge to be profitable. AI companion companies Character.AI and Replika boast millions of users while running on models far less powerful than the ones from the major labs.

Speaker 3 (03:19):
Hmmm, it's interesting, it is.

Speaker 2 (03:23):
We talked about this and some of the incidents that
have happened, and again they keep popping up all over
the country, all over the world for that matter, and
they kind of get hushed up to some degree.

Speaker 3 (03:36):
But the reality of.

Speaker 2 (03:40):
This is, this is real, this is dangerous, and everything else. We take things too far. It was very nostalgic. There was a Halloween movie that was on TV, my kids loved it when they were younger, Hocus Pocus, that thing,

(04:02):
you know. And again, I was watching, I saw the beginning of it, you know, just glancing at it out of the corner of my eye, and it was a beautiful, you

Speaker 3 (04:11):
Know, fall New England day.

Speaker 2 (04:14):
Kids out and about, playing in the leaves and on the playground, no fricking phones. No phones, their phones weren't a part of what they were doing. There was a point in time where that existed here in this country.

Speaker 3 (04:33):
And again parents, you know.

Speaker 2 (04:36):
You're gonna really, you're gonna have to take the things away at some point in time. It's funny though, you know, like I said, I'm hoping, hoping the pendulum is starting to swing back. There were a couple of stories about how many younger people now are kind of missing old tech, like old digital cameras or CDs, vinyl,

(05:01):
other things from yesteryear, and they're starting to be a
little bit more aware of their screen time. It's not
so much screen time, it's what you're doing with that.

Speaker 3 (05:15):
Screen.

Speaker 2 (05:15):
Yeah, you want to be away from screens. I get that.
If you're swiping all the time, if you are constantly
on social media, engaging on social media, that's not true
human connection by any stretch of the imagination. Like I said, the phone is a powerful device. It's a wonderful device if

(05:37):
you want to learn something, if you don't know something and you want to look something up, all of that. Again, it's fantastic, you're walking around with a library in your pocket. There's good use and bad use of any type of tech. Social media has

Speaker 4 (05:54):
Gone down a real weird rabbit hole with how the algorithms work and how they get people fighting with one another.

Speaker 2 (06:01):
We've talked about that here on the program. This AI stuff with all this sycophantic, you know, nonsense. And again, some of these AI companies cheering on how they increase engagement, how it's up twenty-something percent. Why? Well, because they're telling you what you want to hear. Again,

(06:24):
frightening to say the least. This pendant-around-your-neck thing, I don't know.

Speaker 4 (06:34):
You see somebody wearing it, you might want to reach out to them and talk to them. Okay, get them away from their stupid chat AI, sycophantic nonsense tech that serves no purpose at all.

Speaker 2 (06:52):
Watchdog on Wall Street dot com