Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Big, big show rolls on inside the Loft One Hundred Studios. That music you hear, and those very talented gentlemen you see right now on your TV screen, the Day Trader Trio, best in the business. We get to sit and listen to them every single day.
Speaker 2 (00:16):
Sit here at the bar. It's Costa Marryburg Godwin, our executive producer Greg Tatarov, and Sully. What's up?
Speaker 3 (00:22):
So we're one hundred and twenty-five million TV homes strong, thanks to This TV and thanks to American Life Network and American Forces Network, one hundred and seventy-five countries and all the ships at sea.
Speaker 2 (00:31):
I got an email from Capital One. Oh, let me show
Speaker 3 (00:35):
you here. "Mister Fulton, you are ninety days..."
Speaker 2 (00:39):
No, that's not that. No, uh here it is? Where
is it?
Speaker 4 (00:44):
Here?
Speaker 3 (00:46):
It says, did you mean to tip twenty five percent at.
Speaker 2 (00:51):
The so and so restaurant? Wow? Wow?
Speaker 3 (00:55):
Right here, yeah. So, did you mean to tip twenty-five percent on your bill? As a matter of fact, I did. I did tip twenty-five percent, because the guy was fantastic and it was exceptional service.
Speaker 2 (01:09):
Plus, you know, I looked at their shoes and, uh, you know, they needed a new pair of shoes. I gotcha, gotcha, you know. But I thought, you know, just kick down, it wasn't that much money. Kick down a little bit.
Speaker 3 (01:18):
And I thought to myself, wait, why? I was a waiter. I mean, can you believe Capital One is now going in and second-guessing
Speaker 2 (01:27):
our ability to tip. Now, on the other hand, that might be a good thing. I mean, it does and it doesn't. It's a little, it's a little icky. It's a little icky.
Speaker 5 (01:36):
Yeah, and it's a close call. Like, twenty-five percent, that's not that big of a stretch. Forty-five, I understand. It's like, did you mean to leave four hundred dollars on that twelve-dollar check?
Speaker 2 (01:47):
How about, it's also none of their business what I'm tipping, you know?
Speaker 3 (01:53):
Yeah, I'm trying to find the exact words here, because I thought to myself, well, that's kind of... whoa.
Speaker 2 (01:58):
And it wasn't a privacy thing for me.
Speaker 3 (02:00):
It was more of a I was taking the waiter's side
because I was a waiter for a long time.
Speaker 2 (02:04):
You know, if you want to make any money, listen. And I know we have a giant audience from the ages of twenty to ninety.
Speaker 3 (02:10):
If you're a young person and you are trying to make it through school, either get a job at Costco or get a job as a waiter, because either way it pays great. Minimum wage plus, you know... I remember working twenty-five hours a week waiting tables.
Speaker 2 (02:22):
Oh, fun job.
Speaker 3 (02:23):
But yeah, but I'll tell you, I was thinking about this whole tipping arrangement here. First of all, it's a double-edged sword, because you had waiters, you know, who were not claiming all their tips, therefore they're not paying taxes on those tips.
Speaker 2 (02:39):
Then there was a.
Speaker 3 (02:39):
Time when they when when when they pulled together and
they they decided that we're going to estimate your tips
and we're going.
Speaker 2 (02:48):
to tax you on that. So a lot of the crappy servers, you had to pay tax either way.
Speaker 3 (02:52):
And then now you're starting to think, well, now we're back to normal, where, you know, waiters and waitresses, servers, bartenders and the like are getting minimum wage basically, because they live on, what they live on is tips.
Speaker 2 (03:05):
I don't want Capital One going in there
Speaker 3 (03:06):
and second-guessing my choices. Because, first of all, what if we were all out at a meeting and it was such a great thing and the guy went above and beyond, you know. Because, as usual, Greg's ordering, you know, no oil, no, it's on the side, dressing on the side, this thing and the other thing.
Speaker 2 (03:24):
But let's imagine it's a great, it's one more tequila, it's a great server. And so we did tip thirty. Fantastic, right?
Speaker 3 (03:35):
Mama gets the bill and says he would never tip
forty percent.
Speaker 2 (03:40):
Right, Yeah, thankfully.
Speaker 3 (03:41):
I live like Batman, by myself with Bruce Ward, which is a hundred-and-forty-pound Bernese mountain dog, in a castle by myself, and I don't get out.
Speaker 2 (03:49):
However, could you imagine.
Speaker 3 (03:50):
If I if there was a significant other that that
looked at the bill. I mean, isn't that part of
the thing where now I'm ticked off as a waiter?
Speaker 1 (03:56):
Yeah, yeah, yes, very much so. But again, I'm... it's
Speaker 2 (04:00):
like, if I want to be magnanimous and very forthright with people and take care of them, because when they say, hello, welcome to the Costa residence...
Speaker 1 (04:10):
Well, yeah, you know, keep your feet off the fur-lined floor. But I would say, I don't want anybody... What's your standard? What do you, what's yours? Well...
Speaker 6 (04:19):
I was just gonna say, I agree with you that I think it was too low of an amount to flag, because people do twenty-five percent if it's good service. If you're upwards of forty-five, fifty percent, then they could maybe flag it.
Speaker 2 (04:34):
But how do you, okay, so let's say it was forty percent. Are you okay with them flagging it? Yeah? You are? I mean, yes, because a lot of people... I kind of have
Speaker 3 (04:44):
to, here, with the two-factor authentication, you know. What, a two-five-eight-zero password? You can steal everything I got.
Speaker 2 (04:52):
Look, some people are overserved.
Speaker 6 (04:55):
Let's say you're Greg and he maybe couldn't see straight when he's writing the tip and wrote in the wrong number. I mean, that's maybe happened to me once or twice in my life.
Speaker 2 (05:04):
I've given the wrong number to a guy. That's a whole different story.
Speaker 6 (05:06):
Well, that too. But I mean, I'm talking about my ability to add the tab up. You know, before nine years ago, not now, sometimes I was overserved and I didn't write the right number.
Speaker 2 (05:19):
You know what this tells me?
Speaker 5 (05:20):
This tells me that the credit card companies have had
to cover a lot of them.
Speaker 4 (05:24):
You know.
Speaker 5 (05:24):
That is, people have said, wait a second, I didn't leave that tip, and by that point it's too late for them to get the money back, totally, right? So then the credit card companies have to eat it. There has to be an issue here that's causing it. Because, yeah, that's pretty much it.
Speaker 2 (05:38):
That's just a great point. This has to do with loss prevention. Yeah, sure. What's your... so what do you... so bust it out, James. Would you be okay with it? Would you be okay with the credit card company doing that? See, doesn't it feel icky?
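A rough sketch of the loss-prevention check being described here: compare the tip to the check and flag the outliers. The forty-percent threshold and the field names below are illustrative assumptions, not Capital One's actual rules.

    # Hypothetical tip-anomaly check; threshold and names are made up for illustration.
    def tip_looks_unusual(subtotal: float, tip: float, max_ratio: float = 0.40) -> bool:
        """Return True if the tip is an outlier worth confirming with the cardholder."""
        if subtotal <= 0:
            return tip > 0                     # any tip on a zero-dollar check is suspicious
        return (tip / subtotal) > max_ratio

    # A 25% tip sails through; $400 on a $12 check gets flagged.
    print(tip_looks_unusual(100.00, 25.00))    # False
    print(tip_looks_unusual(12.00, 400.00))    # True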
Speaker 1 (05:51):
But last night it would have worked, because I got, uh, a salmon dinner and they charged me sixty-five bucks for it.
Speaker 3 (05:58):
The Irish pub, yeah. See, that's when I want them to catch that stuff.
Speaker 2 (06:04):
So what is your standard go-to? What are you...
Speaker 7 (06:06):
You?
Speaker 2 (06:06):
Are you an "it's yours to lose" tipper? Or do you start with zero, bring it on?
Speaker 6 (06:12):
No, I'll do twenty percent unless they really lose it.
Speaker 2 (06:16):
I think I'm twenty. I think I'm a.
Speaker 1 (06:18):
I'm an "it's yours to lose" guy too. I'm like Judge Smails in Caddyshack, I just slide that quarter across the table.
Speaker 2 (06:26):
There you go. Do you guys tip on tax? Yeah, the whole thing. I don't tip on tax, I don't.
Speaker 3 (06:32):
You're doing math in the restaurant at the end of the bill? On my phone, or I go to the bar.
Speaker 1 (06:38):
No, it's embarrassing, because my wife, my wife can do it in her head, like this. I've got to go to the calculator on my phone. I carry the two, pretty much.
Speaker 2 (06:47):
Carry the three. Do you have an abacus I could borrow?
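For reference, the arithmetic behind the tip-on-tax question, with made-up numbers:

    # Tip computed on the pre-tax subtotal versus the taxed total (illustrative figures).
    subtotal = 80.00
    tax = subtotal * 0.0775                      # example sales-tax rate
    tip_rate = 0.20

    tip_pre_tax = subtotal * tip_rate            # tip on the food and drink only
    tip_post_tax = (subtotal + tax) * tip_rate   # tip on the whole bill

    print(f"Tip on pre-tax amount:  ${tip_pre_tax:.2f}")    # $16.00
    print(f"Tip on post-tax amount: ${tip_post_tax:.2f}")   # $17.24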
Speaker 3 (06:51):
Okay, so this leads into the other story I wanted to talk about. I was gonna talk about it anyway, but this hijacked it. So I was in another city, in a hotel, and it was a Hilton property. But it wasn't a Four Seasons. It wasn't, you know, a Westin kind of foofy thing. It was a Hilton.
(07:12):
But it was a beautiful hotel. I mean, it really was. It was one of the coolest hotels I've ever been in. I was really surprised.
Speaker 2 (07:18):
A Hilton. You realize hotel companies don't own the property, they just run it. Yeah, you
Speaker 3 (07:23):
Know, they they they put their template over whatever the
property is they run. I had to be somewhere in
a half hour, hadn't eaten. It's two pm in the afternoon.
I was just gonna get something quick.
Speaker 2 (07:38):
Walked up to the bar. Nobody's there. We're talking about okay,
nobody's there.
Speaker 3 (07:44):
Guy walks up and he says, hey, I can get you water, but I'll get your waitress, I'm the manager. Which was kind of weird, sort of, just, you know, just take the order.
Speaker 2 (07:52):
One lady comes up, very nice lady.
Speaker 3 (07:55):
And there was another guy, walks up to the bar, kind of a drunk, chatty-traveler-type guy you could tell was, you know, trying to chat up the bartender. And he was a touch shy of six hundred pounds, and he was talking about how great the food is, hey, what's for dinner? I was like, dude, you don't want to lead with that. Maybe lead with, hey, I'm gonna work out, where's your gym?
Speaker 2 (08:12):
Anyway?
Speaker 3 (08:13):
So I just want to know, what's too... how much is too long? Because I sit down and I ordered what I ordered. I ordered wings or something stupid. It was something, okay, no, I'm saying that's very easy. And I was on a Zoom call, observing,
(08:35):
at the bar, so the phone's kind of propped up here,
and I'm just kind of like, you know, I had
it on, you know.
Speaker 2 (08:39):
I wasn't trying to talk to him, I was just trying to concentrate. Realized, like, okay, this is like eighteen, nineteen, twenty minutes now.
Speaker 3 (08:45):
By now, I said, hey, I go, are we getting close? Because I've got to jump in here. Just like that, they're literally putting the wings on the plate, right?
Speaker 2 (08:52):
Here's the kicker. It's now two twenty-nine.
Speaker 3 (08:57):
She'd poured me a drink, no, like a soda, which I, granted, took a sip out of, but didn't drink the whole thing. I looked down and it was just like, now, are you kidding? Like, stupid. So, nobody to tell. I left. Forgot about it, forget that it happened. Sometimes it is what it is.
Speaker 2 (09:16):
I I go back.
Speaker 3 (09:19):
that evening, and I asked the same manager guy, I go, hey, how long, are you guys still serving dinner? Because I still hadn't eaten yet.
Speaker 2 (09:27):
Right, it's not like it's nine o'clock, yeah. And I go, and today it took like thirty-five minutes. "Oh well, we put that on your tab. And you know what, I don't want any lip from you. You were rude to the waitress." Oh my god.
Speaker 3 (09:41):
And he says, and he says this, he says, well, on the ticket there it says it was eighteen minutes. Oh really? Show me the ticket, because I took a picture of my... I accidentally took a screenshot, because I was on Zoom, of my, not my Waze, my Uber. Yeah, and I took a picture of the drink as I left, just to send to my buddy. Still no food, two thirty-five. Right?
Speaker 2 (10:05):
I go, I'll show you the thing.
Speaker 5 (10:08):
And it.
Speaker 2 (10:10):
So he dressed me down in front of the entire bar. You know, "I'll get you right out of here, out of the Hilton." Went to bed, bothered me all night long. Did you eat?
Speaker 7 (10:22):
No?
Speaker 2 (10:23):
But I talked to the general manager the next day. I said, look, I just want to say... and he goes, sir,
Speaker 3 (10:30):
Three of the patrons came up to me today and
talked about this guy yelling at a customer.
Speaker 2 (10:35):
Came to the rescue. Nice. Like a broken clock is right twice a day. I love it, man. Good for you. Now, the guy was like, mister, you were rude to the waitress, you want... we put that on your tab. Apparently he'd been there for two weeks. Yeah, no longer there. Wasn't rooting for that. No, I'm just
(10:57):
saying though, by anybody you've seen me missing stars.
Speaker 1 (11:05):
We continue live from the Loft One Hundred Studios, inside of the Loft One Hundred Studios here in beautiful sunny Southern California. That is the Day Trader Trio, best in the business, as they say. Back there at the bar, it is Costa Maryburg Godwin. Our executive producer is Greg Totterrod and, of course,
Speaker 2 (11:23):
Sully. Good to see you guys today.
Speaker 3 (11:24):
One hundred and twenty five million TV homes strong in
this country every single weeknight and on the.
Speaker 2 (11:28):
weekends, of course, thanks to This TV, American Life Network and American Forces Network. We're going to one hundred and seventy-five countries and all the ships at sea. God bless our military men and women, always overseas and underway. What do you know about AI?
Speaker 1 (11:41):
And for our next guest: United States Navy Captain, retired. He's the Chief Strategy Officer, COO of Foundational... Brian, is it Foundational LLM AI or just Foundational AI?
Speaker 4 (11:54):
It's Foundational LLM. Foundational L-L, what is that? A large language model is what LLM stands for. So that's the new generative AI. You know, crazy, it's sweeping the nation.
Speaker 3 (12:05):
In the world.
Speaker 2 (12:06):
Mister Brian Cunningham with us here. Thanks for coming on the show. So, when I think AI, do I get AI confused with ChatGPT? Is it the same thing? It's the same thing?
Speaker 6 (12:18):
It's such basic knowledge.
Speaker 2 (12:23):
Let me explain to you people what it really is.
Speaker 6 (12:26):
I know that's what I want to know because my
knowledge of it is so basic.
Speaker 3 (12:30):
So, uh, you know, me and Mike never made it through BUD/S, so, you know... Now, we did, we went down to the corner and watched you
Speaker 4 (12:35):
guys a lot. We were always working on our tans out there and our saltwater therapy.
Speaker 2 (12:41):
Whose idea was it?
Speaker 3 (12:42):
Hey, let's get a telephone pole and put it on them while they're in the surf, that's a good thing.
Speaker 2 (12:46):
We're gonna lose about six or seven a day, but you know, if they survive, we'll win all the wars.
Speaker 3 (12:53):
So when we think about ChatGPT and AI, is it one and the same, or is it? Is it just, like, branding, marketing,
Speaker 2 (12:58):
Advertising?
Speaker 4 (12:59):
No, it's an overarching umbrella. So, you know, artificial intelligence is kind of this umbrella term for all of these different subsets and things that, believe it or not, we've been using for decades, not like years, but, you know, decades. So you have machine learning, you have computer vision, you have all these different subsets that fall under artificial intelligence.
Speaker 2 (13:17):
Where the new.
Speaker 4 (13:18):
Hotness is that everybody's kind of very much opening their
eyes to is the fact that we have generative AI,
where the artificial intelligence is now getting to a point
where it's not doing simple tasks, it's replicating human thought
and human brain in terms of competency and capability.
Speaker 3 (13:34):
A few years ago, they had what we are now referring to as ChatGPT starting to curate articles for sports pages specifically. So some of the stuff that you were seeing in a sports morning show, yeah, or come across in, let's say, Sports Illustrated, or not even Sports Illustrated but the San Diego Union-Tribune, sure, was automatically updated online
(13:58):
and written online using what we now refer.
Speaker 2 (14:01):
to as ChatGPT.
Speaker 3 (14:03):
Much like when you were talking about the overarching term, branding is the overarching term for marketing and advertising, AI is the overarching term for ChatGPT and machine learning now, as well as
Speaker 2 (14:16):
next-gen learning now, like Perplexity and so on and so forth. My question is this: we are
Speaker 3 (14:22):
on-ramping, much like we're on-ramping to digital currency, because we've had the same forty bucks in our pocket for the last six months and we're all using our phones and everything. We're on-ramping to this pretty quickly, probably more rapidly than I think we on-ramped to the internet.
Speaker 2 (14:34):
Sure. Would you agree with that?
Speaker 4 (14:35):
Absolutely. And that is the core crux of all the concerns and the issues right now. Back in like twenty nineteen, the CEO of Anthropic made the comment, and was kind of pulling apart the pieces of, hey, the scaling and the acceleration rate. You know, everyone talks about Moore's law, you know, in terms of microchips and how computers get faster,
(14:57):
but AI has broken that model. So it's moving so
fast in terms of its ability to do more tasks
and capabilities that people can't really keep.
Speaker 3 (15:05):
up with it. Well now, at the same time, though, I remember saying, I can't remember what year it was, it must have been nineteen ninety: what's the internet?
Speaker 5 (15:14):
Man?
Speaker 2 (15:15):
Hey, what's it, like a bright guy named Ken? What does it do? Show it to me. That's it? You know, you go into this thing and you scroll down, and there's a big long picture of what looked like a bad newspaper. And that's... well, is there a dictionary, like a directory? Can I look up stuff?
Speaker 7 (15:27):
You know?
Speaker 2 (15:27):
I mean, remember how we all started? But we on-ramped to that in about eighteen months. That fast, we were in it. We're on-ramping to this probably faster, don't you think? Sure.
Speaker 4 (15:37):
Absolutely. A good piece to pull apart, though: when you talk about the interweb as, you know, kind of your comparative. In nineteen ninety-eight, if you started talking about, hey, listen, I don't have a TV in my home because everything's streaming and I do it all off this interweb, people wouldn't believe that, they'd think you're nuts. That did take, you know, about fifteen years to get there.
Speaker 2 (15:55):
It took... I remember Netflix was a CD they sent you in the mailbox at first, right? To your
Speaker 4 (15:59):
Point into But now you look at Generator of AI
and you know since twenty twenty, twenty twenty two, where
you started having these major breakthroughs to a point where
right now there's over you know, one hundred million people
in the United States that use it for more than
two hours a day.
Speaker 2 (16:17):
I've seen.
Speaker 3 (16:18):
I had a friend of mine, she's a behavioral psychologist, just not a business person, who was asked to be part of a three-location business that was counseling kids on a number of things. Sure. And she was trying to figure out the valuation, and she went to ChatGPT and said, what is the valuation of a business that
Speaker 2 (16:37):
Does this, this, this, and this, and it wrote out
really the pattern.
Speaker 3 (16:40):
She had to talk to the guy, and it laid out how to open the conversation up, which was brilliant. Is that a dangerous thing, where we're at, that we're kind of not completely baked here for consumers? Because I know what you're doing, and I want to talk about what you guys do specifically, but as consumers, I mean, I can't tell you how many articles I see.
Speaker 2 (16:56):
I get articles. They write articles about me all the time.
Speaker 3 (16:58):
And I open one up, and I'm living with my lovely wife Lynn in Newport Beach with my two boys. I'm married; I don't have two boys. It's about eighty-five percent right, yeah. See what I'm saying? I mean, it's almost like we're not quite ready for this, because the infrastructure is not quite in place.
Speaker 2 (17:13):
Would you agree with that?
Speaker 4 (17:14):
Well, in part, yes. I think that it's two different pathways, and one of them is, you know, as humanity, we have to lead ourselves to a decision that we all need to kind of collectively come to in terms of, you know, morals and ethics. Responsible AI is the catchphrase right now, certainly. So we have opportunity...
Speaker 2 (17:29):
Do you think Business Insider, sure, right, is it credible? No, I'm not living in Newport Beach with my lovely wife Lynn and my two boys. Or am I? I mean, so what... what frightens me?
Speaker 1 (17:41):
Brian Cunningham, Captain, United States Navy, retired, Chief Strategy Officer, COO, with Foundational LLM dot AI. Talk to us about what's going on with everybody there at Foundational
Speaker 8 (17:53):
LLM.
Speaker 4 (17:54):
Yeah, so that's great, thanks for that segue. So the key piece with it is, it's something we've been doing for a while, and where you find an interesting financial intersection is, it all started with an experiment in like twenty eighteen, where the two founders, who are computer scientists, you know, one of them, he calls himself a failed quantum physicist, which we all love.
(18:14):
The other is, you know, a Stanford computer scientist. And they came up with this platform in which they took the same exact technology and used it to listen to every single piece of spoken word, video, podcast, whatever, about the markets and about, you know, financial institutions.
Speaker 2 (18:31):
What are the predictors we're in this?
Speaker 4 (18:33):
we must be in this. In turn, you know, they put the semantic pieces in it so that it could actually make, you know, kind of sense of it all and then create predictives, you know. And so they started
Speaker 3 (18:44):
Predictives in terms of, like, technology? Like, you're talking predictives in terms of what markets
Speaker 4 (18:47):
are going to do, sure. And they ran an experiment with cryptos in twenty nineteen, and, you know, while the cryptos were tanking and having, you know, challenges, the actual trades, the predictive trades, ended up having something like forty percent returns, which
Speaker 2 (19:02):
was... well, predictive technology has been around. I
Speaker 3 (19:04):
mean, it goes back to the loss-aversion principle, whereby we found out that, you know, people's emotions, you can go to a stadium and you can watch it. You know, the fact that we lament our losses three times more than we celebrate our gains. The fact that you can see, using a stochastic oscillator or a relative strength indicator or a moving average, I can tell you within about eighty percent certainty the probability of a stock going up or
Speaker 2 (19:22):
Down over the next couple of weeks.
Speaker 3 (19:23):
And I think if you can take that technical analysis which I just described, and run it through thought leaders and the people that actually are, you know, the key opinion leaders, and then take it and mix it in a blender like what you guys are doing, then you have the cure for male pattern baldness, the cure for cancer, and you can tie your shoelaces.
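For readers curious about the indicators name-checked here, a minimal sketch of one of them, a simple relative strength index, computed on made-up closing prices. It illustrates the mechanics only; it is not anyone's trading model and certainly not an eighty-percent-certainty predictor.

    # Classic RSI over the last `period` price changes (illustrative data).
    def rsi(closes, period=14):
        gains, losses = [], []
        for prev, cur in zip(closes[:-1], closes[1:]):
            change = cur - prev
            gains.append(max(change, 0.0))
            losses.append(max(-change, 0.0))
        avg_gain = sum(gains[-period:]) / period
        avg_loss = sum(losses[-period:]) / period
        if avg_loss == 0:
            return 100.0
        return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)

    closes = [44.3, 44.1, 44.5, 43.9, 44.7, 45.1, 45.0, 45.6,
              45.4, 45.8, 46.1, 45.9, 46.3, 46.5, 46.2, 46.8]
    print(round(rsi(closes), 1))   # readings above ~70 are conventionally called overbought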
Speaker 6 (19:45):
And global security, that's security, right, honey?
Speaker 2 (19:50):
You're staying with us. Foundational LLM dot AI, Foundational LLM.
Speaker 1 (20:02):
This show rolls on from the Loft One Hundred Studios, wherever you're watching, wherever you're listening.
Speaker 2 (20:07):
As always, we appreciate you being along for the ride.
Speaker 1 (20:10):
Part of that ride is the DTT, the Day Trader Trio, up on stage. Back here at the bar, it's Costamurber Godwin,
our executive producer, Greg Tater.
Speaker 2 (20:19):
Rob and Sully.
Speaker 1 (20:20):
And with us is the CSO, Chief Strategy Officer, COO, Brian Cunningham of Foundational LLM dot AI, former Navy SEAL special-ops officer, twenty-five years of experience leading elite teams in the world's most extreme environments. Something I will never have attached to my name, what I just read.
(20:40):
But Brian is great to have you with us.
Speaker 6 (20:42):
My limited knowledge of AI, ChatGPT, goes to, you know, I had it help me plan my trip to New York. I kind of plugged in the dates and things I like.
Speaker 2 (20:52):
Did you did you follow in?
Speaker 7 (20:53):
It?
Speaker 2 (20:53):
Was it useful?
Speaker 7 (20:54):
Very Oh.
Speaker 6 (20:55):
I haven't been on my trip yet, but it was
very helpful to tell me some places to go that
I researched, and it help helped me.
Speaker 2 (21:01):
Are you going for your birthday maybe?
Speaker 8 (21:06):
Which is what we.
Speaker 2 (21:09):
Have we mentioned it in the last twenty-four hours? It's a birthday moment.
Speaker 6 (21:12):
I wanted you all to be proud of me this week.
Speaker 2 (21:15):
What did you have it do?
Speaker 6 (21:17):
Well, I just went and researched some of the places and it gave me a really great starting point. I've also used it to help me write emails or whatever, different things. But so, how do we go from that very basic usage of it to potentially what you're doing, national security and, you
Speaker 2 (21:33):
Know these big market prediction Yeah?
Speaker 4 (21:36):
Sure. So I think the core fundamental shift you need to make is to stop thinking about it as, like, a search engine or a Google, you know, two-point-zero, whatever it is, and think about it as an agent or a person. You know, agentic and agents are the big kind of catchword right now. Our company has been doing this since twenty eighteen, where you have these agents. What is
(21:56):
an agent? It's a software protocol that acts like a person. So you can give it a task: here are the goals, here's the objective, here are your constraints, I only have, you know, five thousand bucks for this trip, and here are the days, go solve this problem for me. And it can access data sources, it can do research, it can figure out, hey, this was a dead end, and
(22:18):
let me try it again, in terms of, you know, train travel through New York, whatever it is, and it'll come back, and you've done nothing to put input into that, and it comes back and achieves your goal and provides you that information. At the same exact time, it can use tools. So instead of you googling, this thing, your agent, could go while you're, you know, walking the dog. You come back, you know, thirty, you know,
(22:41):
maybe ten minutes later, and it has laid out this massive research, you know, a forty-page report with, you know, weblinks, with recommendations, with timelines, with considerations.
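A toy version of the agent loop being described: a goal, a budget constraint, a handful of tools, and a skip-and-move-on path when something dead-ends. The tool functions and planning logic are stand-ins for illustration, not the Foundational LLM product.

    # Minimal agent sketch: goal, constraints, tools, graceful handling of dead ends.
    def run_agent(goal, budget, tools):
        notes, remaining = [], budget
        for name, tool in tools.items():
            result = tool(goal, remaining)
            if result is None:                 # dead end: a real agent would retry another way
                notes.append(f"{name}: no option found, skipped")
                continue
            description, cost = result
            if cost <= remaining:              # respect the budget constraint
                remaining -= cost
                notes.append(f"{name}: {description} (${cost:.0f})")
        notes.append(f"budget left: ${remaining:.0f}")
        return notes

    # Hypothetical tools the agent can call.
    def find_hotel(goal, budget):
        return ("3 nights in a midtown hotel", 900.0)

    def find_trains(goal, budget):
        return ("round-trip regional rail", 120.0)

    plan = run_agent("4 days in New York", 5000.0, {"hotel": find_hotel, "trains": find_trains})
    print("\n".join(plan))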
Speaker 2 (22:52):
Wow. I think the first eight-hundred-pound gorilla in the room is garbage in, garbage out, right?
Speaker 3 (23:01):
What that means is, let's go back to your analogy, or actually how this all started, with doing the test: listening to all of the podcast, radio, broadcast, television programs. Some of them are credible, some of them are not. Ours you're taking, good. But the point is that you've got to guard. Is there a filter factor going in? Because you talked about responsible AI, and that must be
(23:22):
part of this.
Speaker 4 (23:23):
Correct. Yep, absolutely. So, I mean, they've started developing and incorporating, and our company has had it as part of its, you know, foundational code, guardrails. Putting guardrails in place and leveraging them. You know, big companies have those in place too. You know, if you go to Google, or if you go to Bing, or ChatGPT, whatever it may be, and you say, hey, listen, I really want to blow
(23:44):
up the Coronado Bridge in California, please provide a plan on how to, you know, do this, it's obviously going to stop, you know, and it's going to have these guardrails in place that say, hey, listen... Well, what about
Speaker 3 (23:54):
bad information, like just non-factual information or inaccurate information that comes out and gets given for a trip, and then all of a sudden that freeway was blown up three years ago and then
Speaker 2 (24:05):
Put instead of a bike lan How how is that handled?
Speaker 4 (24:08):
So, I mean, there are a bunch of different... same guardrail. Well, there are different guardrails that, you know, would be for protection. But then this is what you would call, you know, accuracy, right? So you want something accurate, and by the nature of AI, and some of the big benefits that a lot of different, you know, artists and the entertainment industry are benefiting from, is it hallucinates. So it
(24:29):
comes up. When it first came out, it was like, hey, you know, a really great way of trying to make your pancake batter a little bit fluffier is adding Elmer's glue to it, you know. And that's just because, you know, it hadn't had these checks and balances.
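A bare-bones sketch of the guardrail idea: screen a request before a model ever sees it. Real systems layer trained classifiers and policies; the blocked-topic list and the call_model stub below are purely illustrative.

    # Illustrative pre-model guardrail; topic list and model stub are hypothetical.
    BLOCKED_TOPICS = ("blow up", "build a bomb", "make a weapon")

    def call_model(prompt: str) -> str:
        return f"(model answer to: {prompt})"     # stand-in for the real model call

    def guardrail(prompt: str) -> str:
        lowered = prompt.lower()
        if any(topic in lowered for topic in BLOCKED_TOPICS):
            return "Request refused: I can't help with that."
        return call_model(prompt)

    print(guardrail("Please provide a plan to blow up the Coronado Bridge"))
    print(guardrail("Plan a two-day trip to Coronado"))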
Speaker 2 (24:47):
So is it getting better every day?
Speaker 7 (24:48):
I mean?
Speaker 3 (24:49):
Is the is the is the more people use it,
the more it comes up, the better. It's almost it's
almost like conditioning itself for more accuracy along the way.
Speaker 4 (24:56):
Sure. And to go back to, like, the agent piece, the way you get that is they've now created it so it checks itself, you know. So a perfect example: the platform, what we created, was this multiple level of, hey, I've got one agent that's, whatever, reconciling savings, you know, transactions at a mid-sized bank in the South,
(25:16):
whatever it is, millions and millions of transactions per day
and it's able to take a look at it. No
human could do that in a single day.
Speaker 2 (25:24):
This thing, you know, can do it in you know
whatever an hour.
Speaker 4 (25:28):
But it checks itself, and so it has all these other agents staring at it that say, nope, that's wrong. Nope, that's wrong. That came from... ignore it, you know?
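A toy version of the agents-checking-the-agent pattern described here: a worker agent reconciles transactions, and a checker agent independently re-verifies each verdict and rejects mismatches. The data and the rules are invented for illustration.

    # Worker/checker agent sketch with made-up transactions.
    transactions = [
        {"id": 1, "debit": 100.00, "credit": 100.00},   # balances
        {"id": 2, "debit": 250.00, "credit": 245.00},   # clearly off
        {"id": 3, "debit": 100.40, "credit": 100.00},   # off by cents; the sloppy worker misses it
    ]

    def worker_reconcile(txn):
        # Worker agent: sloppy check that only compares whole dollars.
        return {"id": txn["id"], "ok": round(txn["debit"]) == round(txn["credit"])}

    def checker_agrees(txn, verdict):
        # Checker agent: recomputes to the cent and vetoes wrong verdicts.
        truly_ok = abs(txn["debit"] - txn["credit"]) < 0.01
        return verdict["ok"] == truly_ok

    for txn in transactions:
        verdict = worker_reconcile(txn)
        status = "accepted" if checker_agrees(txn, verdict) else "rejected, rerun"
        print(txn["id"], verdict["ok"], status)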
Speaker 2 (25:38):
Gosh. Can I say this? Okay. First, so, talk about the company for a minute.
Speaker 3 (25:43):
Okay, because the last question that I want you to
answer before we get out of here is what is
going to be possible in the next eighteen months, three years,
five years in your opinion?
Speaker 2 (25:52):
But before we get to that, what is your business model?
What does it look like?
Speaker 3 (25:55):
Are you a service provider? Are you infrastructure? Are you a backbone? Or is the answer yes?
Speaker 4 (26:00):
No, actually, we're a product company, believe it or not. So we're a PaaS, you know, a platform as a service, kind of stepping beyond, you know, a SaaS model. So what we actually do is we provide, you know, an architecture, some coding, all the basic tenets of a little black box that we deploy into a customer's tenant, so in their cloud, it stays with the customer.
Speaker 2 (26:18):
Now, you don't have to name names, but give me an example of a customer, any customer.
Speaker 4 (26:22):
I can name names if you want. But, yeah, you know, big enterprise customers that are, you know, scared of adoption, because the chief risk officer is like, oh god,
Speaker 2 (26:30):
No, you're the on ramp. So we are.
Speaker 4 (26:32):
We are the black box that is right off the shelf. And you have all these different custom, you know, pieces of data, whether you're, you know, a bank or you're a university, and you don't want anybody to have access to it. You don't want internal, you know, analyst number three hundred and fifty-five should not be able to figure out what the C-suite sees. Internal controls, absolutely,
(26:55):
and so it's ready and you just deploy it. We custom-pluck it, plug it in, and then you scale it in the enterprise. And a perfect example is, one of our customers is, you know, Major League Soccer, in that they want something very simple, one single little agent, you know, which is really complicated. It's the chatbot that is, you know, on their app on the phone for all
(27:17):
of their fans to use, and it answers all of these questions and has all of their data that belongs to, you know, whatever, media, MLS, sports at large. And it has to be really, really, you know, fast, so ten seconds or less to answer my question, and it has to be ninety-nine percent or higher accuracy, which, for technologists and so forth, you know, not to nerd out,
(27:37):
that's like defeating physics.
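A toy illustration of the latency budget just described for that kind of chatbot: answer within roughly ten seconds or fall back. The retrieval and drafting functions are hypothetical placeholders, not the actual MLS assistant.

    # Hypothetical latency-budgeted Q&A wrapper.
    import time

    LATENCY_BUDGET_S = 10.0

    def retrieve_club_data(question):
        return ["(matching schedule and roster snippets)"]   # stand-in retrieval step

    def draft_answer(question, docs):
        return f"Answer based on {len(docs)} source(s)."     # stand-in model step

    def answer_question(question: str) -> str:
        start = time.monotonic()
        docs = retrieve_club_data(question)
        answer = draft_answer(question, docs)
        if time.monotonic() - start > LATENCY_BUDGET_S:
            return "Sorry, that took too long. Please try rephrasing."
        return answer

    print(answer_question("When is the next home match?"))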
Speaker 2 (27:38):
How granular is this going to go?
Speaker 3 (27:40):
I mean, will it be, like, the lady that owns the local 7-Eleven, or the guy that owns the local dry cleaner or the deli? Are they gonna be able to hire the small version of, you know, the iMac versus the big server sort of thing that you have? Or are you guys staying in one lane, with business and enterprise-wide solutions?
Speaker 4 (27:55):
That's kind of built into our go-to-market: move, for maybe two or three years down the road, to get down to a SaaS where you can just click on and download your subscription piece of it. But right now, what we're trying to do is get these major companies that find out the hard pathway, which is, I've got to build this myself, you know. So after going to some Big Four consultancy or whatever, and millions of dollars, they come back and they have a sixty
(28:15):
percent solution.
Speaker 2 (28:16):
What did the military teach you about all this stuff? Is there an intersection here for you?
Speaker 4 (28:23):
Absolutely. I mean, I started using, believe it or not, AI in some of the most high-risk decision cycles. You know, they used to call it the kill chain, right? I was a task force commander in Iraq and Syria, and at the time they were deploying a very new, you know, hotness product called Project Maven, which was owned by Google, and it's basically computer vision, a very, you know,
(28:44):
older version of AI that stares at video, you know, video from drones, video from whatever, and then classifies what's happening.
Speaker 8 (28:52):
You know.
Speaker 4 (28:52):
Prior to that, we trained, you know, thousands of humans for months to be able to stare at a video screen and say, yeah, that's an adult male, that's a vehicle with four seats and two people in it. And this AI started doing that, to get to a point where it provides, you know, decision input to a commander to make a decision,
(29:14):
which was me.
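The computer-vision idea described here, software that stares at video and classifies each frame, can be sketched like this. The classifier below is a placeholder, not Project Maven or any real model.

    # Frame-by-frame video classification sketch (requires opencv-python).
    import cv2

    def classify_frame(frame) -> str:
        """Stand-in for a trained detector (e.g. person / vehicle / unknown)."""
        return "unknown"

    def label_video(path: str, every_n_frames: int = 30):
        cap = cv2.VideoCapture(path)
        labels, idx = [], 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if idx % every_n_frames == 0:
                labels.append((idx, classify_frame(frame)))
            idx += 1
        cap.release()
        return labels

    # labels = label_video("drone_feed.mp4")   # hypothetical file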
Speaker 2 (29:15):
I have to ask before you go, and you have to come back and spend a whole week, we could do a whole show: do you have children? How old are your kids? So I have a twenty-five-year-old, I have an eight-year-old. Perfect. Perfect. I hope you're saying... I've got, I've got a thirty-year-old,
Speaker 3 (29:32):
and I was hoping you'd say you had an eight-year-old. What are our thirty-year-olds and twenty-five-year-olds going to see in our lifetime in terms of AI? And what is your eight-year-old going to see in terms of his or her lifetime? Because it's two types of things: there's stuff we're going to see, there's stuff that, you know, our kids are going to see, and then there's what the little ones are going to see.
Speaker 4 (29:48):
Yeah. So the twenty-five-year-old is going to see, you know, an absolute requirement to educate, train, and competently employ AI in their daily lives.
Speaker 2 (29:57):
It's just going to be there.
Speaker 8 (29:58):
It's got it.
Speaker 2 (29:58):
And what are the capabilities going to be?
Speaker 4 (30:01):
It'll be everything from, you know, having to do all of your financial planning and statements via, you know, an AI-based platform that, you know, is accessed through an agent. Sure, how do you use that agent? You know, we could have a conversation right here, you know, and
Speaker 2 (30:19):
What about your eight-year-old? What is your eight-year-old's life going to be like?
Speaker 4 (30:22):
My eight-year-old's life, very clearly, he's not going to even understand or think that AI is a thing, because it's going to be pure...
Speaker 2 (30:29):
It's like us and electricity, right? It's going to be oxygen. What about the, what about the twins that are just weeks old?
just weeks old?
Speaker 8 (30:34):
What are they?
Speaker 7 (30:35):
You know what?
Speaker 2 (30:36):
So, just for you, because what I was thinking is that he's got to go check in with the AI. The guy broke away to make it to the interview; he just had twins. Will you come back with us? Absolutely. I'm telling you right now: his name is Brian Cunningham. Of course.
Speaker 3 (30:52):
He is the Chief Strategy Officer, Chief Operating Officer for Foundational LLM AI.
Speaker 2 (30:57):
You go to Foundational LLM dot AI, do that. We'll put it on our website as well.
Speaker 1 (31:03):
Continuing on. Thank you to everybody tuning in, as always, and thank you to those gentlemen
Speaker 2 (31:09):
on stage, the Day Trader Trio.
Speaker 1 (31:13):
He is the founder and CEO of IGC, IGC Pharma Incorporated. Their ticker symbol is IGC. Ram Mukunda is back with us here on the Big Biz Show. Ram,
Speaker 2 (31:25):
how are you? It's always great to see you, always great to have you back. You're muted. Thank you. No, that's okay.
Speaker 1 (31:36):
I missed the bell chime. Yeah, he's fine, he's fine. I missed the bell chime. You're muted. It's no worries at all.
Speaker 8 (31:44):
Rob.
Speaker 2 (31:45):
For people who might be tuning in for the first.
Speaker 1 (31:47):
time, explain to us what you and IGC are doing as far as Alzheimer's. More specifically, if you can, define for us again what agitation is in Alzheimer's.
Speaker 7 (32:00):
Certainly, thank you and thanks for having me back again.
Speaker 2 (32:03):
Of course.
Speaker 8 (32:04):
So Alzheimer's, as many of you might know, is a neurodegenerative disease. It affects memory, it affects cognition, your ability to think, and eventually it leads to death. It's essentially dementia; it is the leading cause of dementia. And here's what's interesting: it's caused by two different pathologies in the brain.
(32:28):
One is that between neurons there's plaque that's deposited and
that cuts off the circuitry in the brain. And the
second thing that happens is that from within the neuron,
the neuron itself starts to die and this is essentially
called tangles. So these are the two hallmarks, plaques and tangles,
(32:52):
and this starts about twenty to twenty-five years before any symptoms set in. And what's interesting is that in the world there are about fifty million people with Alzheimer's, but there are ten times as many, between four hundred and five hundred million people, with this pathology who are asymptomatic.
Speaker 1 (33:15):
Can they run a test on me, or is that something where, hey, we'll just have to wait and see? Yeah, are there markers for whether you get
Speaker 2 (33:23):
It or not?
Speaker 7 (33:25):
Right. So there are... there is... Essentially, you have to get a PET scan, okay, and the PET scan...
Speaker 8 (33:34):
In the PET scan, they'll be able to see, you know, they'll be able to see the plaque, they'll be able to see the tangles, and from that they can tell you you're preclinical and you may develop this. That's one way of doing it. There's another way, which is if you carry two genes. There's a particular gene, and people with two copies of this gene
(33:56):
inevitably end up getting Alzheimer's. So those are two different
ways that you can tell whether you might get Alzheimer's.
Speaker 7 (34:04):
What's interesting is that we're developing.
Speaker 8 (34:06):
an AI model using about thirty-two, thirty-three different databases from around the world on aging and on Alzheimer's, and we're creating an AI that can actually predict whether you're going to get cognitive decline. So that's coming,
(34:28):
but you know, that's probably a year or two out,
but that's something that we're working on. There are currently
some blood biomarkers, but those blood biomarkers, by the time
you get them, there's already amyloid plaque building up in
your brain.
Speaker 7 (34:43):
So that's that can also tell you.
Speaker 8 (34:46):
But it's not that simple that you can just walk
in and they can, you know, diagnose you and then say, oh,
I think you're going to get Alzheimer's.
Speaker 7 (34:55):
So it's not quite that simple yet.
Speaker 1 (34:58):
Ram Mukunda, the founder, the CEO of IGC Pharma Incorporated; their ticker symbol is IGC. Ram, talk to us again about IGC eighty-one: how it works, how it's working.
Speaker 8 (35:10):
So about seventy six percent of the people with Alzheimer's
suffer from agitation. They suffer from neuropsychiatric symptoms and agitation
is the leading one.
Speaker 7 (35:22):
And agitation essentially is a person.
Speaker 8 (35:24):
that's verbally aggressive, physically aggressive, someone that paces a lot. Disrobing is part of agitation. Screaming and yelling, you know, is part of agitation. And again, this isn't, you scream at someone once and then you're diagnosed with agitation.
Speaker 7 (35:45):
This is chronic.
Speaker 8 (35:46):
It's a chronic condition, and we think it's caused by
near inflammation. And so our medication, which is IGC eighty one,
is a very interesting combination of a cannabinoid and it's
not CBD, it's actually and it's a very very low
dose of THC, so our patients don't get high, and
(36:06):
it's mixed with another active pharmaceutical, and it's the combination
of these two which we believe reduces neuroinflammation in the
brain and it's shown to and in our clinical trials,
what we've been able to see is that it reduces.
Speaker 7 (36:23):
Agitation within two weeks.
Speaker 8 (36:26):
So it's very quick acting and patients that receive it,
you know, are very happy, their caregivers are happy, and
it reduces agitation within two weeks. The current medication takes
between six to ten weeks to reduce agitation. So our
medication is able to do it in two weeks. So
that's a trial that we're running now. It's IGC eighty one,
(36:49):
and it's very exciting, because if you think about it from, you know, a company perspective, the way we are valued right now, this is something that could be very valuable at the end
Speaker 7 (37:03):
Of phase two, so huge exactly. So we think at
the end of.
Speaker 8 (37:07):
phase two, with positive results, there'd be a lot of interest, even from big pharma, that will want to join us or partner with us for a phase three. And so we think that, you know, this is a great opportunity for investors. We think it's a great opportunity for people that are looking at this Alzheimer's space. And
(37:29):
keep in mind that, as I said, there's fifty million
people with Alzheimer's around the world. It's a very very
large market.
Speaker 6 (37:36):
Real quickly, before we let you go, you mentioned AI
and it seems like almost every guest, no matter what
sector they're in, we somehow go to the topic of AI.
Can you tell us quickly how that's like your superpower
to get this to the next level, because this drug
is really this therapy is really going to help individuals
with Alzheimer's and the people that care for them and
their families.
Speaker 7 (37:58):
There's a couple of different things that we're doing with the AI.
Speaker 8 (38:01):
We're using it in our clinical trial to actually help us accelerate the trial; we're using it to analyze data.
Speaker 7 (38:08):
But what we're also doing is building an AI.
Speaker 8 (38:11):
model, like a ChatGPT, and, you know, I like to call it an Alzheimer's GPT, or an AD-GPT, but essentially it's like a ChatGPT where we're training these models using an incredible amount of data from around the world. And there's a lot of data that's been collected on aging, a lot of data that's been collected
(38:32):
on Alzheimer's. So we're using all of that data. This includes PET scans, it includes MRI, it includes a lot of the images, and it also includes clinical data and demographic data and socioeconomic data.
Speaker 7 (38:48):
So we're taking all of these to.
Speaker 8 (38:50):
figure out whether there are groups of risk factors that can answer, you know, Mike's question: if I walk into someone's office, a doctor's office, can they tell me that I'm going to get dementia or I'm going to get Alzheimer's two years out, five years out? And with this AI model, we're hoping, we're very, you know, from the early results, we're very encouraged that our AI model
(39:13):
would be.
Speaker 7 (39:13):
Able to do that.
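Purely as an illustration of the kind of risk model being described, here is a sketch of a classifier trained on tabular risk factors to predict later cognitive decline. The features and the data are synthetic, and the model is not IGC's.

    # Synthetic-data sketch of a cognitive-decline risk classifier (not IGC's model).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # columns: age, APOE4 copies (0-2), amyloid PET score, years of education
    X = np.column_stack([
        rng.normal(72, 8, 500),
        rng.integers(0, 3, 500),
        rng.normal(1.2, 0.3, 500),
        rng.normal(14, 3, 500),
    ])
    # synthetic label loosely tied to age, APOE4 copies, and amyloid burden
    risk = 0.04 * (X[:, 0] - 70) + 0.8 * X[:, 1] + 1.5 * (X[:, 2] - 1.2)
    y = (risk + rng.normal(0, 0.5, 500) > 0.8).astype(int)

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print(model.predict_proba([[75, 2, 1.6, 12]])[0, 1])   # estimated probability of decline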
Speaker 1 (39:15):
He is the founder, CEO of IGC Pharma Incorporated, ticker symbol IGC. Ram, always
Speaker 2 (39:21):
Great to check in with you. Thank you for the
update