Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The Watchdog on Wall Street podcast explaining the news coming
out of the complex worlds of finance, economics, and politics
and the impact it will have on everyday Americans. Author,
investment banker, consumer advocate, analyst and trader, Chris Markowski.
Speaker 2 (00:16):
Jurassic Park and AI.
Speaker 3 (00:20):
Ian Malcolm, played by Jeff Goldblum in Jurassic Park, one
of my favorite characters as far as action adventure movies
are concerned, without a doubt. Some great one-liners in there,
and one of my favorites that I've often used here
on the program was "your scientists were so preoccupied with
(00:42):
whether or not they could that they didn't stop to
think if they should." And you could take out the
word scientists, you can put in investment bankers. You could throw
a myriad of things in there, and it's applicable to many,
(01:06):
many different things. I have put out warnings over various
different bits of technology over the years and the potential
for misuse and what could possibly happen, going back,
I mean, going back to the early
(01:27):
days of MP3s, MP3s.
Speaker 2 (01:31):
And Kazaa, and they're free.
Speaker 3 (01:35):
Napster was out there, and the problems... I
remember Metallica and some of the bands were like, this
is ridiculous. You know, we can't make any money on
our work anymore. And you can watch, you can go
back to that point in time when everybody decided to
stream everything and everything was going way cheap, and people
(01:55):
are not buying albums anymore. They're just downloading stuff.
And you can see, most certainly, the decline in
the quality of music. I'm sorry, it is what it is,
you know, the rise of Auto-Tune and fake artists
and all of this other garbage that's coming out. And
(02:15):
now what we're going to have is we're going to
actually have AI music. The number one country song in the
United States, I think last week, was an AI song, no
people involved. And I went back then and, you know,
I talked about music, how my love of it is
(02:36):
the relationship you had with it. And I think some
people like myself are, you know, back into vinyl once again,
you know, holding it, the art, all the things that
went along with that, and that's been taken away.
Speaker 2 (02:51):
I don't think Steve Jobs fully realized what his device would.
Speaker 3 (02:56):
Do and how it would go about changing things, because
it has, without a doubt. Anyway, anyway, moving on from that,
we can move on to social media. We here on
this program are upfront and honest. I remember, I remember
being, it was a guest appearance on the CW
(03:19):
Daily Buzz program back in the days, like two thousand
and seven to like twenty ten. I was doing it once
a week. I'd have to go out to Orlando and
do the show there, and I remember there was a
battle between Justin Timberlake and Ashton Kutcher over who could get
more followers on Twitter when Twitter first came out, and
(03:39):
I said, this is not going to end well, it's
not going to end well. I mean, you think about Facebook.
You know, you watch the movie if you're not familiar
with the story. It was basically, basically a program
put together to meet girls, and it turned into something,
quite frankly, that I don't care for. Do I have
(04:00):
to use it? Yes. Do I interact on it? I don't,
much to the chagrin of the people that, you know, put together
this show, and I am not getting into conversations and
feedback and doing that because I just don't think it's
a very humane forum by any stretch of the imagination.
Speaker 2 (04:24):
I don't. It's ungodly to me to interact.
Speaker 3 (04:27):
With people like that, and what people have taken it to,
and how they treat others. You know, the things that
people will say online. There was an old thing when
we were kids, say it to my face. Most of
them would never do that, and again that's a bit
of a problem, and I think it's done more harm. Again,
I gave the comparison of the pink slime from Ghostbusters
(04:49):
to the negative energy that has just permeated this country,
pitting people against one another. And those algorithms as well.
They're pretty smart. They know what to do, they know
what to feed people. They know what to say or
what to put in your feed to get you going,
to get you upset, all of those things so they
get you clicking on things. So again, you've got more
(05:11):
eyeballs and they can make more dollars. Again, I understand
the concept of advertising and ratings from back in the day,
but this, my friends, this is a little bit ridiculous,
and we all understand the whole dopamine hit bit that
it gives you and how.
Speaker 2 (05:28):
It affects people.
Speaker 3 (05:31):
Now we're on to AI, and this is a pretty,
pretty powerful tool.
Speaker 2 (05:42):
I said this before.
Speaker 3 (05:42):
I use AI for search first, for looking certain things up.
I use AI to take like a transcript of one
of these programs, to put it in there and to
clean it up. I don't allow AI to think for
(06:03):
me by any stretch of the imagination. Greg Ip had
a piece in the Wall Street Journal, "The Most Joyless
Tech Revolution Ever: Is AI Making Us Rich and Unhappy?"
Discomfort around artificial intelligence helps explain the disconnect between a
solid economy and an anxious public.
Speaker 2 (06:20):
People are concerned. They don't know whether or not they're
going to be able to hang on to their jobs.
Speaker 3 (06:24):
They don't know whether or not they're going to end
up being replaced by AI and what it all means.
Speaker 2 (06:29):
And we've had.
Speaker 3 (06:30):
Quite a few apocalyptic movies when it comes to artificial
intelligence over the years, and what could happen. I just
rewatched Blade Runner, a genius movie from nineteen eighty two,
AI applied to robots that are put out there.
Speaker 2 (06:48):
We all know WarGames.
Speaker 3 (06:49):
You could talk about Terminator, there's so many of
them that are out there, where these things can get
out of control, and we're already seeing certain AI programs
basically wanting to self-replicate, protect themselves from being shut down.
We've seen the various different hallucinations that some of these
AI things have, where they're just making stuff up, making stuff up,
(07:15):
making court cases up, doing various different things. So they're
not reliable. Again, it's a tool that you have to
pay attention to, and you have to be aware of
what it's capable of doing. Several years ago, I
guess it's got to be going back to like twenty sixteen,
twenty seventeen. Here on the program, I spent some time
(07:38):
talking about this new thing that was coming up. It
was called deepfakes. It was essentially
early AI where you could basically fake videos and fake photographs,
and it was pretty obvious back then, you could see it.
Speaker 2 (07:56):
Not so much.
Speaker 3 (07:58):
Anymore. I, for the life of me, don't understand
some of these apps like Sora, where you can take
somebody's picture and likeness and.
Speaker 2 (08:12):
You could.
Speaker 3 (08:14):
Make them do horrible things and put it out there
and there's no watermark on it. There's nothing on
it to say, hey, this is not real. Anybody, you
kind of study all of the nonsense in the lead-up
to World War One and the assassination of
the Archduke Ferdinand. You know, obviously, you know that his assassin
(08:36):
was a Serbian separatist and you've got the Austro-Hungarian
crown prince going to be there, all sorts of issues.
Then it just got out of control, various different
information flying around that wasn't true. You're talking about, you're
talking about the ability to push people in a certain
direction via fake videos or, you know, fake material,
(09:01):
whatever it may be. A better example of this,
quite frankly, is the film Wag the Dog, which was
just genius, a great movie. Robert De Niro, Dustin Hoffman, Woody
Harrelson's got a little small part in it as well.
And how, you know, the government, and this is before
(09:21):
all of this technology, you know, manufactures a fake war
against Albania to get the president, you know, out of trouble.
You can see how this stuff could work.
South Park did an episode on it last week, and
some of the things that this thing can do, you know,
(09:43):
it's amazing.
Speaker 2 (09:44):
You think about all of the danger.
Speaker 3 (09:46):
You can see the danger that goes along with some
of this stuff and nothing being done. Let me give
you an example. Father Mike Schmitz, he's a bit of
a rock star in Catholic circles, he's a part of
the Hallow app and Ascension, and it's just unbelievable, things
(10:07):
from all around the country. People are using deepfakes
and fake videos of him asking people for money. He
had to put out a whole thing. He said, I'm
not doing this. This is not me. People using his
likeness to go ahead and do that scares the crap
out of me.
Speaker 2 (10:30):
It does. It scares the crap out of me.
Speaker 3 (10:33):
And somebody could take my likeness and put it out
there and say that I'm recommending this stock or whatever
it may be, and people can act upon that. Why
are we not doing anything to rein this in? And you
don't think that this is a danger? For crying out loud,
I mean, we don't allow kids to drink until they're
(10:55):
twenty one here in the United States. Sure, they can
go off and they can kill themselves in some neocon war,
but they can't drink until they're twenty one.
Speaker 2 (11:05):
But we allow something as dangerous as this? Is this not more dangerous?
Speaker 3 (11:12):
You know, the big alcohol lobbyists just, you know, you know,
basically wrecked the hemp industry here in this country because
they want people drinking more again. And you mean to
tell me that that's more dangerous than this? You don't
(11:32):
see the tremendous potential for all sorts of nefarious, if
not deadly, things that could happen.
Speaker 2 (11:42):
With this.
Speaker 3 (11:44):
And we all know that in our society a
lie will make its way millions of times, millions and
millions of times, out around the globe before the truth
comes out the front door. Was it Ben Franklin
(12:05):
who said that? You know, a lie makes its way around
the world, or around the country in his
point in time. Now with technology, it's going
viral millions of times around the globe before the truth
gets out the front door. You see the type of
harm that could be done with this, and we're just
(12:29):
taking a step back, you know. You take a look
at this, this Sora app and what it can do.
And South Park parodied it this past week, with the kids
on South Park putting out horrific videos about each other
and putting it.
Speaker 2 (12:46):
Online, you know. Again, you gotta watch.
Speaker 3 (12:50):
You know, I'm not going to describe the things on
the show, the various different cartoon characters raping kids
and all sorts of stuff. Again, South Park's a little demented,
but that's neither here nor there.
Speaker 2 (13:01):
There's that other line from Jurassic Park.
Speaker 3 (13:03):
What was it Ian Malcolm was saying? He was
talking about, you know, the awesome power of genetics,
the most awesome power in the world, and
you guys are wielding it like a kid wielding
his father's gun in the house.
Speaker 2 (13:17):
You don't think that this in the hands of kids.
Speaker 3 (13:23):
You don't think about the type of bullying that could take
place with things like this? You don't think that this
is the slightest bit dangerous at all? Listen, people, I
don't, I don't know what to tell you. I'm not a
big rules and regulations guy, but I also believe, you know, again,
(13:46):
I like to think that we live in a world
where people can police themselves, but we don't. Okay, we
don't. It's a post-Christian world we're living
in right now. Most people don't fear God at all.
Speaker 2 (13:57):
They don't.
Speaker 3 (13:58):
Then, again, they're gonna do whatever they want. It
makes it very difficult to be a libertarian, you know,
type of a guy when you live in a world
where that many people cannot police themselves. You know, I
don't know how you're gonna put the genie back in
the bottle with this.
Speaker 2 (14:15):
You probably aren't, but.
Speaker 3 (14:18):
Don't you think we should have serious penalties for messing
around with people's likeness? Serious repercussions. I'm talking serious
jail-time type stuff. I mean, you could destroy someone's reputation,
(14:39):
and you know how much we gotta fight to get
that back after something like this. Again, I'm just saying,
I'm just warning you, and again, I'll give you another
Ian Malcolm quote: "Boy, do I hate being right all the time."
(15:00):
Watchdog on Wall Street dot Com