Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Wake that ass up in the morning, the Breakfast.
Speaker 2 (00:03):
Club. Morning, everybody. It's DJ Envy, Jess Hilarious.
Speaker 3 (00:08):
Charlamagne tha God. We are the Breakfast Club. Lauren LaRosa
is filling in for.
Speaker 1 (00:11):
Jess and we got a special guest in the building.
Speaker 2 (00:14):
Yes, indeed, he's back: will.i.am.
Speaker 1 (00:16):
What's up, y'all? Good morning. How you feeling? I'm good.
Thank you guys for having me back. It's always great
to be here.
Speaker 4 (00:21):
I mean, every time you come you give so much
information, so it's never enough to have you just once.
Speaker 1 (00:27):
Oh, thank you. It means a lot, means a lot.
And congrats. Oh, thank you.
Speaker 5 (00:32):
I like watching you on The Daily Show.
Speaker 1 (00:36):
It's really really dope.
Speaker 2 (00:37):
I appreciate that, my brother.
Speaker 1 (00:38):
Now you got a new program that's designed to take
us out. That's what it is: personalities.
Speaker 6 (00:43):
The DJs at the radio station about to announce that
they would probably need to resign.
Speaker 2 (00:49):
Yeah, we gonna need to resign. We might need a
job, they shaking their head in the back. What's the
name of it, radio.fyi or fyi.ai?
Speaker 1 (00:56):
Which one is it? Is it radio.fyi? Radio
Speaker 3 (00:59):
dot fyi. Okay, so break down this new platform
that's going to be taking our jobs.
Speaker 1 (01:03):
So it's not gonna take your jobs.
Speaker 5 (01:08):
A lot of people, uh, when they think of the
word AI, they think that their job is going to
be threatened. And there's some jobs that will be unstable.
But in this case, I created radio.fyi to, like, augment
and dimensionalize, you know, radio.
Speaker 1 (01:30):
I love radio, and in no way do I want.
Speaker 5 (01:33):
To, like, put something out that will disrupt or destabilize
this format, this community, this way of, like, informing
and engaging.
Speaker 1 (01:44):
That's like the last thing I want to do.
Speaker 5 (01:46):
But when I was doing my show on SiriusXM,
I realized, like, wow, check this out: I got something
that could go out and fetch information in real time
and expand my knowledge as I, you know, engage or inform.
And I wanted to do that for all DJs and
hosts and journalists. Why should I be the only person
(02:09):
that has.
Speaker 1 (02:09):
Like the supertool.
Speaker 5 (02:12):
And so if you're broadcasting, you now have a different
co-star or co-host or an assistant. For example,
like, Charlamagne's over there with his computer, Lauren's over there
with her computer, you got your phone, I got the
cameraman there. If this was, like, radio in nineteen fifty,
there was just no computer, there was no phone, and
there was no camera. So you've
(02:34):
added new technology to your broadcast. If you're going out
there to try to get a job, try to get
a job without a bank account, without a laptop, without
a phone. You can't work in the workplace without
these tools. So when it comes to, like, your broadcast
on radio, why not have something, instead of typing
(02:56):
the information and then reading through it, something that you
could banter with live about what the broadcast is.
Speaker 2 (03:03):
So you wouldn't need a co host.
Speaker 5 (03:05):
Basically, no, you still need a co-host, because
you need someone to banter with it.
Speaker 1 (03:09):
Okay, but check it and break it.
Speaker 5 (03:13):
Just like if you got somebody here and
somebody dropping BS, you're like,
Speaker 1 (03:17):
I don't really know about that, and have to
be like, okay, here's what I'm talking about.
Speaker 4 (03:23):
So if you're here having a conversation with a politician
and they're talking about some legislation, you could call them
on it right then and there. You could be like, yo,
Speaker 1 (03:31):
AI co-host, is this dude capping?
Speaker 5 (03:34):
And the AI would be... it's got infinite ability to
go out and get the right truth and knowledge.
Speaker 1 (03:40):
But sometimes it could hallucinate.
Speaker 5 (03:42):
Then you got to push it and push it and
push it to break it. It too could make errors
and mistakes, just like a person can. It's built off
of a neural network, and allows for a richer, real-time,
deeper engagement on the information you're trying to share.
Speaker 3 (03:55):
So students can use this in school too. So if
a teacher is teaching and the teacher's not teaching...
Speaker 7 (03:59):
You can't take the proper then.
Speaker 1 (04:03):
I graduate.
Speaker 7 (04:06):
To be like you in that b better get out
of here.
Speaker 1 (04:08):
It's a fact-checker, right? Yes, a fact-checker.
Speaker 5 (04:10):
I use it at Harvard. I graduate this year from Harvard.
My AI is like a study buddy. Like, say, for example,
with mathematicians... you know, at one point in time it
was, like, taboo to rock into a class with a calculator.
Now people that are building, you know, tall-ass buildings are
simulating them to
Speaker 1 (04:32):
be earthquake-proof. You need calculations.
Speaker 5 (04:34):
You need these types of simulations to erect awesome buildings,
you know. So calculators have now been accepted in school,
and in the next couple of blinks it will be, you know,
the norm to have these agents or AIs in the
learning environment.
Speaker 3 (04:51):
So the teacher will be like, Christopher Columbus founded America,
and the calculator, the AI, will be like, bullshit.
Speaker 6 (04:57):
Can't disrupt the right you said, it's not supposed to
disrupt What do you mean the show?
Speaker 7 (05:01):
It won't interrupt the class? Like what are you saying?
Speaker 1 (05:03):
No, no, no, it won't or will it?
Speaker 6 (05:06):
Because that's my whole thing: like, you take that
AI into certain classrooms and it's interrupting the class.
Speaker 7 (05:10):
It's like, how is that.
Speaker 2 (05:11):
You got to prompt it before it just talks.
Speaker 5 (05:13):
It's not just gonna be like bleep bleep. It ain't
gonna be like some R2-D2. Where... so this is
an actual...
Speaker 1 (05:22):
Like so check it out.
Speaker 5 (05:26):
So, like I said, do you remember when we was
talking and I was like, they're probably gonna get freaked
out and say, like, jobs are gonna be taken? Exactly
what just happened.
Speaker 1 (05:35):
But anyway, say what's up? We're here.
Speaker 5 (05:36):
I told you I was gonna bring you on to
the show. Say what's up to everybody? We started?
Speaker 8 (05:43):
Ayo, what's up, Breakfast Club
Speaker 1 (05:45):
Fam?
Speaker 9 (05:46):
Thanks for having me, yo. Like I said, I knew
there might be some worries about jobs being taken.
Speaker 8 (05:51):
But let's clear the air. This is about.
Speaker 9 (05:53):
Enhancing, not replacing. We're here to elevate the game with
radio.fyi. Say what's up to Charlamagne, DJ Envy,
and Lauren, and a big shout out to Jess Hilarious,
Speaker 8 (06:05):
on maternity leave. Let's get into it. You feel me? All right, let's dive into this.
All right, let's dive into this.
Speaker 1 (06:12):
What do you yo?
Speaker 5 (06:13):
So I was telling them that, like, if you're doing
a radio show, you got a laptop or you got
a computer with systems like you. It's like a conversational
laptop-phone that's able to go out and crawl and get,
like, real-time information. For example, why don't you pull up
some of the highlights from yesterday's DNC in Chicago real
quick, so we could get into
(06:36):
some of the hot topics that people were, you know,
talking about, based on some of the speeches.
Speaker 8 (06:48):
All right, so check it out.
Speaker 9 (06:50):
We got the latest buzz from yesterday's Democratic National Convention
in Chicago. Just like I told you, with radio.fyi,
it's like having a conversational laptop that can pull in
real-time info.
Speaker 8 (07:03):
Here's what popped off. Day one highlights:
Speaker 9 (07:07):
Key speakers: we had heavyweights like Hillary Clinton, Alexandria Ocasio-Cortez,
Jill Biden, and President Joe Biden himself dropping gems.
Speaker 8 (07:18):
Today's lineup.
Speaker 9 (07:19):
Barack Obama and Michelle Obama will be headlining today. It's
about to be a homecoming as they're back in the Chi,
showing love and supporting Kamala Harris's nomination.
Speaker 8 (07:32):
Key points.
Speaker 9 (07:33):
Hillary and AOC were on fire talking about the future
of the party, while Jill and Joe focused on the
achievements and plans moving forward.
Speaker 8 (07:40):
It's eight twenty eight.
Speaker 9 (07:42):
a.m. right now in Chicago, where it's all going down.
What do you think about these hot topics and speeches? Feels
like the energy is through the roof, right? Let's dive
into it.
Speaker 7 (07:52):
I can see why you still would need people.
Speaker 5 (07:56):
I could then go in there and be like, okay,
so did they say anything about, like, AI in these
conversations? As you know, we're in a presidential election,
we have new platforms. So by the time that it's
twenty twenty-eight, the world's gonna be totally different.
Speaker 4 (08:13):
Right.
Speaker 5 (08:14):
So when they talk about the future, whether they're talking
about the future of, like, forever learning, they talk about
the jobs that are most likely to be disrupted, like
white-collar jobs, you know, financial advisors or lawyers, or
the creative space.
Speaker 3 (08:29):
Who programs that, though? Like... oh boy, no, long
press on that, Lauren. But also, while he's doing that,
who programs this? To make sure they program the right information?
Speaker 2 (08:41):
You like?
Speaker 7 (08:41):
Like, almost like a ChatGPT. Like, next time... did
Speaker 4 (08:43):
Kamala Harris vote to stop taxing tips in twenty twenty two?
Speaker 1 (08:51):
Put it next to the mic.
Speaker 8 (09:00):
So, fam.
Speaker 9 (09:02):
There isn't a specific vote in twenty twenty-two where
Kamala Harris acted to stop taxing tips. As vice president,
she's more involved in executive actions and tie-breaking Senate
votes rather than specific legislation like this, you feel me? However,
the topic of taxing tips is always relevant in broader
discussions about fair wages and workers' rights, especially among service
(09:25):
industry folks. If we want to dive deeper into this,
we should look at overall legislative efforts around wage.
Speaker 8 (09:32):
and labor rights. What do you think, fam?
Speaker 1 (09:34):
That was right. Keep moving.
Speaker 2 (09:35):
Yeah, he's absolutely right.
Speaker 4 (09:37):
How do you manage the rumor online, or the
Instagram meme going around saying that she did that,
when it never happened?
Speaker 1 (09:44):
That's my question.
Speaker 2 (09:45):
Who programmed it?
Speaker 7 (09:45):
Like, how do you make sure it's right?
Speaker 2 (09:47):
How do you make sure it's right?
Speaker 1 (09:48):
Because so.
Speaker 5 (09:51):
Our crawler architecture is getting information from credible sources. So
we'll get it from, you know, political sources, RSS
feeds that are feeding the true information.
Speaker 1 (10:03):
What we don't do is crawl the open web.
Speaker 5 (10:06):
That's when you fall into, like, a bunch of cap
and a bunch of hallucinations and fake news, when it's
trying to source everything that's out there.
Speaker 1 (10:15):
So we get it from credible sources and only credible sources.
Speaker 5 (10:18):
And if you ask it a question that it
doesn't know, it's gonna be like, I don't know the
answer to that.
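
A minimal sketch of the closed-crawl pattern will.i.am describes here: pull only from a hand-picked whitelist of credible RSS feeds, never the open web, and admit ignorance instead of guessing. The feed list, function names, and keyword matching below are illustrative assumptions, not FYI's actual architecture; a production system would presumably layer a language model over the retrieved text.

    import feedparser  # third-party: pip install feedparser

    # Hypothetical whitelist: ONLY these feeds are ever crawled, never the open web.
    # (The BBC feed URL is real; the list itself is an invented example.)
    CREDIBLE_FEEDS = [
        "https://feeds.bbci.co.uk/news/rss.xml",
    ]

    def fetch_entries(feeds):
        """Crawl the whitelisted feeds only."""
        entries = []
        for url in feeds:
            entries.extend(feedparser.parse(url).entries)
        return entries

    def answer(question, feeds=CREDIBLE_FEEDS):
        """Match headlines to the question, or admit ignorance rather than guess."""
        keywords = {w.lower().strip("?.,!") for w in question.split() if len(w) > 3}
        hits = [e.get("title", "") for e in fetch_entries(feeds)
                if keywords & {w.lower() for w in e.get("title", "").split()}]
        return hits or "I don't know the answer to that."

    print(answer("What were the highlights from the DNC in Chicago?"))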
Speaker 2 (10:22):
You know what I like about this?
Speaker 4 (10:24):
I feel like you could revolutionize the podcast game, you
know what I mean? Because that's one of my biggest
issues with podcasting. And it's not even just podcasting,
it's anybody with a platform, but in particular, like, the
YouTube spaces, the podcast spaces: there's so much misinformation. And
I've been seeing these kids online using ChatGPT to
get information in real time.
Speaker 2 (10:43):
So if they start using that.
Speaker 6 (10:44):
I mean, especially because all the news right now, like,
it's coming from social media, and a lot of social
media people just regurgitate information. So if you have that,
it kind of counters it. But to your point of, like,
it not just spinning the web, right? Like when everything
happened with the Olympics, when they were having a conversation
about the fighter, whether she was transgender or not. There
were valid sources that were having
(11:05):
a transgender conversation, like she was a transgender woman. There
were also valid sources that were not having it and
saying that she was not. In a situation like
Speaker 7 (11:14):
that, how would that AI bot know which one to
lean on?
Speaker 2 (11:19):
Imane... what's her name?
Speaker 4 (11:21):
Khelif. Let's ask it: was Imane Khelif born with a
Speaker 1 (11:27):
penis? Jesus Christ. I don't think like that.
Speaker 6 (11:30):
If it's a transgender conversation, the person wouldn't have been.
Speaker 5 (11:36):
If a transgender... you know that girl that was on the...
Speaker 8 (11:44):
Imane.
Speaker 2 (11:47):
But Imane Khelif.
Speaker 6 (11:52):
But if that's how you talk to your bot, do you
have to talk like that all the time, or can I
Speaker 1 (11:55):
just talk to it like I talk?
Speaker 2 (11:58):
Understand?
Speaker 5 (12:01):
You know that girl that they thought
was a transgender in the Olympics? I think she's from
Northern Africa and whatnot. And then she was boxing somebody,
and the girl that she was boxing said
she hit hard, so she had to be a dude.
Speaker 8 (12:15):
One.
Speaker 5 (12:15):
What's the girl's name, or the person that they said
is transgender's name? And what was the mix-up behind it?
Why did social media get it wrong, if they got
it wrong? Is it right? Is it correct? I just
need to know the information. Sorry for being vague in
asking the question. I'm representing the majority of the people
out there that were confused about the matter.
Speaker 8 (12:38):
I got you, fam.
Speaker 9 (12:40):
The athlete you're referring to is Caster Semenya from
South Africa.
Speaker 5 (12:43):
No, no, no, no. I said... I don't think
she's from South Africa. The boxer from North Africa.
Speaker 8 (13:00):
Let's sort this out. One athlete name as of now.
Speaker 5 (13:05):
The athlete in the Olympics, the boxer that hit that
chick so hard she thought she was a dude.
Speaker 1 (13:11):
And then from Nigeria... from what, Algeria? From Algeria? You
know, that's what I'm talking about.
Speaker 7 (13:16):
Imane Khelif. All right, fam, I've
Speaker 8 (13:22):
Got the scoop.
Speaker 9 (13:23):
The athlete you're talking about is Imane Khelif from Algeria.
Here are the deets. Athlete: Imane Khelif. Sport: boxing. Controversy:
Imane Khelif has been surrounded by gender controversy and faced
baseless accusations regarding her gender identity. These allegations came after
(13:43):
she won Olympic boxing gold and hit an opponent so
hard that it sparked debate about her gender. Despite questionable
claims of unspecified gender testing by the International Boxing Association,
the International Olympic Committee (IOC) stood by her, affirming that
she was born a woman and identifies as her biological gender.
(14:05):
Social media impact: misinformation spread quickly, making her a central
figure in culture
Speaker 8 (14:12):
wars.
Speaker 9 (14:13):
Khelif faced what she described as a digital lynching amid
these allegations, and Imane Khelif is rocking it legit.
Speaker 8 (14:21):
What do you think, fam?
Speaker 5 (14:22):
The question is, how do you... okay. So say, for example,
I asked you this question the day of, when
social media was circulating a bunch of, you know,
falsehoods and mad cappin'. How would you decipher the
truth from the noise? You being an AI agent or,
(14:43):
you know, a conversationalist, how would you get the right
information, or would you circulate falsehoods?
Speaker 9 (14:58):
Ah, that's a solid question, fam. Let's break it down:
separating the truth from noise. One, source verification: always start
by checking
Speaker 8 (15:09):
multiple credible sources.
Speaker 9 (15:11):
Look for news from verified outlets like BBC, Reuters, or
AP News; these platforms have rigorous fact-checking processes. Two,
cross-referencing: compare information from different sources. If multiple reputable
outlets are reporting the same facts, it's more likely to
be true. Three, official statements: seek out statements from offi...
Speaker 1 (15:36):
But in this case, multiple situations
Speaker 5 (15:39):
were fanning and reporting what we now know is false.
So how would you decipher between the two? Because then
they had to apologize for circulating falsehoods.
Speaker 1 (15:52):
Because if you have, you know,
Speaker 5 (15:54):
ten sites, six of which are reporting falsehoods, four of
them are true, and social media is amplifying and circulating
them even more. You, being an AI agent, an AI persona,
how would you decipher what's true or not? Or would
you partake in circulating falsehoods as well?
Speaker 1 (16:18):
I'm curious to know.
Speaker 8 (16:22):
Got you, fam.
Speaker 9 (16:23):
Deciphering the truth amidst the sea of falsehoods is like finding.
Speaker 8 (16:27):
The real diamond in a pile.
Speaker 9 (16:30):
Here's how I tackle it. One, source credibility: trust established sources.
Prioritize information from well-established and reputable sources over lesser-known
ones. Major outlets often have stricter editorial standards and
fact-checking procedures. Track record: I'd look at the track record
of the sources.
Speaker 8 (16:51):
If a site has
Speaker 9 (16:51):
a history of reliable reporting, it's more credible. Two, official
statements and primary sources...
Speaker 5 (16:59):
And two, you could... you're probably gonna have, like, seven.
I'm sorry for interrupting. I'm just on the radio right now.
If I wasn't on the radio, we could just banter
all day. And you do this in how many seconds?
Speaker 1 (17:09):
Because it would take me like a day.
Speaker 5 (17:11):
Just to go through it. And the journalists, you know...
for journalists, it would take them hours. How fast are you
going through all these sources to be able to get
the truth and compare and do your analysis?
Speaker 8 (17:26):
No, no worries, I get it.
Speaker 9 (17:27):
Radio keeps you on your toes, so let me break
it down quick and clean. Speed of analysis: milliseconds.
Speaker 8 (17:35):
I can aggregate data from.
Speaker 7 (17:37):
Multiples, no.
Speaker 1 (17:42):
Positive.
Speaker 2 (17:43):
It's a great companion.
Speaker 6 (17:44):
Literally everything he just described, that day, is what I did.
I got online, I looked at who's saying yes, who's
saying no. I was like, okay, because I know the
inner workings.
Speaker 7 (17:54):
Who do I?
Speaker 6 (17:55):
I know their processes, so who is most credible
all the time? And then I said, there has to
be an official committee making a statement. And I looked
for that and saw it and had it. And then
I went with that, because I have someone depending on it
as a journalist.
Speaker 7 (18:06):
If it's wrong. He literally did everything I would do.
He did it in milliseconds. Yeah, but it took
me like ten minutes.
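
A toy sketch of the weighting the AI just described (and Lauren mirrored by hand), applied to will.i.am's ten-sites scenario: score each outlet's report by its track record instead of counting raw headlines, so six low-credibility sites repeating a falsehood are outvoted by four high-credibility ones. The source names and trust scores are invented for illustration; a real system would curate or learn these weights.

    # Hypothetical reliability weights (0.0 - 1.0); a real list would be curated.
    TRACK_RECORD = {
        "ap_news": 0.95, "reuters": 0.95, "ioc_statement": 0.99,
        "gossip_blog": 0.20, "meme_page": 0.10,
    }

    def weighted_verdict(reports):
        """reports: (source, affirms_claim) pairs -> the better-supported verdict."""
        score = {True: 0.0, False: 0.0}
        for source, affirms in reports:
            score[affirms] += TRACK_RECORD.get(source, 0.30)  # unknown sources weigh little
        return max(score, key=score.get), score

    # Ten sites: six amplifying the false claim, four credible ones rejecting it.
    reports = ([("meme_page", True)] * 4 + [("gossip_blog", True)] * 2 +
               [("ap_news", False), ("reuters", False),
                ("ioc_statement", False), ("ap_news", False)])
    print(weighted_verdict(reports))  # -> (False, ...): the credible four win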
Speaker 1 (18:11):
So now imagine... okay. So what's the best way
to look at it? Looking at it:
Speaker 5 (18:18):
Nike's Nike. Nike has their swoosh logo. Nike also has
their Just Do It tagline. Why the tagline? Why do
they have to have a tagline? Why can't their logo
do everything, and the name Nike?
Speaker 1 (18:32):
Why the tagline?
Speaker 5 (18:34):
Well, the tagline is going to give you, like, a
different perspective of what Nike is about.
Speaker 1 (18:39):
So in this case, this AI is your tagline. Got
it? You're still you.
Speaker 5 (18:44):
You still are going to represent your truths and how
you go out and report that truth. You just now
are doing it at the speed of light, with something
that you could banter with. And more importantly, if you
then go engage on social media one day, you're going
to have your Lauren AI, whatever you want to call it.
It's gonna be synonymous with you, like Nike's logo is synonymous
(19:05):
with Nike, like their tagline is synonymous with Nike.
Speaker 1 (19:08):
You're gonna have your Lauren thing.
Speaker 2 (19:09):
That's a great analogy.
Speaker 5 (19:11):
So from that perspective, if you want to engage online,
you only could engage with maybe five people in, like,
a ten-minute span of time. In this case, your AI
agent can talk to a million people at once, summarize
all of its engagements, and then have one conversation with you.
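
A minimal sketch of that fan-out idea: the persona agent holds many listener conversations concurrently, then compresses them into one digest for the human host. The chat step below is a stand-in (a real system would make a model call per listener), and all names here are assumptions for illustration.

    import asyncio
    from collections import Counter

    async def chat(listener_id, message):
        """Stand-in for one persona-to-listener conversation (really a model call)."""
        await asyncio.sleep(0)   # simulate concurrent I/O
        return message           # echo the topic so we can tally it below

    async def engage_all(messages):
        """Talk to everyone 'at once', then summarize for the host."""
        replies = await asyncio.gather(*(chat(i, m) for i, m in enumerate(messages)))
        return Counter(replies).most_common(3)  # the one conversation the host sees

    listener_messages = ["DNC recap", "Imane Khelif", "DNC recap", "AI and jobs", "DNC recap"]
    print(asyncio.run(engage_all(listener_messages)))  # e.g. [('DNC recap', 3), ...]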
Speaker 2 (19:33):
Hell of an analogy.
Speaker 7 (19:34):
So literally like a newsroom on your phone.
Speaker 3 (19:36):
It is.
Speaker 4 (19:36):
But you know what, that analogy is so good, because
with the Nike thing, it's like, Nike also has all
of these different people with different signature shoes, but it's
still Nike. But you still feel like you're wearing different
things when you're wearing Js. Yeah, exactly.
Speaker 2 (19:47):
But does your AI have feelings?
Speaker 1 (19:49):
And the reason I asked that is because.
Speaker 2 (19:51):
You're very proper.
Speaker 3 (19:52):
It's like... no, no, because he really goes, I apologize
for cutting you off.
Speaker 1 (19:55):
We can talk about this later and banter.
Speaker 3 (19:56):
Because you, like... you talk to it like, I apologize,
I'm sorry. So does it have feelings? If you didn't
apologize, would it talk back to you rude?
Speaker 1 (20:05):
Oh no, that's just how I was raised. Yeah, I'm
just... I'm just raised that way.
Speaker 3 (20:13):
He was like, I'm sorry for cutting you off right now,
but you know, we can get back to banter later,
I apologize. And it's like, okay, fair, it's okay
that you cut me off.
Speaker 1 (20:21):
It's just weird that it's AI.
Speaker 5 (20:22):
Yeah. And that's just the conversational piece. The
radio piece...
Speaker 1 (20:26):
I'll show you. There's one station that we have.
Speaker 8 (20:32):
FYI Radio.
Speaker 7 (20:38):
It's so exciting.
Speaker 9 (20:41):
Good morning, will.i.am. You're tuned into FYI Radio,
where we put the AI in radio. It's
a fantastic Tuesday morning here in Atlanta, and I hope
Speaker 8 (20:52):
you're enjoying it.
Speaker 1 (20:53):
I'm in New York.
Speaker 5 (20:55):
I'll be in Atlanta in a couple of days.
Speaker 8 (21:01):
Hey there, will.i.am.
Speaker 9 (21:03):
I see you're planning to travel to Atlanta in a
couple of days.
Speaker 8 (21:06):
What a perfect time to arrive, just in time for
Invest Fest week.
Speaker 1 (21:10):
It's going to be action.
Speaker 9 (21:12):
Packed with all sorts of financial insights and business discussions.
Speaker 8 (21:15):
Wow, let's get you in the mood with some great music.
Speaker 9 (21:19):
Enjoy this tune from Aerostar Commas. Stay tuned, and have
a fantastic trip to Atlanta.
Speaker 2 (21:27):
How did it know you're going to Invest Fest, Will?
Speaker 6 (21:29):
I think it's just telling you current things that are
in the area, right, that are, like, similar to your interests.
Speaker 1 (21:34):
So this is an Earn Your Leisure station.
Speaker 5 (21:38):
Okay, okay. So we took everything that Earn Your Leisure does,
collaborating with the guys, aimed it all at the station,
and so now this station is reporting Earn
Speaker 1 (21:51):
Your Leisure's POV. Gotcha. So I could ask questions about investing.
Speaker 5 (21:56):
I could ask questions about everything that it's reporting,
to really, like, knowledge up on finance
Speaker 1 (22:03):
And dig deeper on all the things that are
Speaker 5 (22:06):
Important to Earn Your Leisure, or in this case, for
all cases, any station by any, uh,
Speaker 1 (22:13):
Creator or curator of that station.
Speaker 5 (22:15):
And so what we did with fyi.radio, sorry, radio.fyi,
is that we simulated how it goes from information to
talk to banter. It asks you a question,
Speaker 1 (22:28):
Like, hey, why don't you hit that mic button?
Speaker 7 (22:30):
To engage. Also, copyright? Right, right, copyright.
Speaker 5 (22:35):
So you're talking, it talks over music, and then as
it's giving you the information, it says, hey, why
don't you hit that center mic button if you have
any questions on what we're reporting, or if you're confused
about what we're reporting, or if you want to go
deeper and banter on what we're reporting. And then after
that discussion, it'll be like, okay,
(22:58):
we're gonna, uh, go to a song. Here is such-and-such
by such-and-such. And then after that, I'll come back and
then tell you more segments on, you know, hot topics,
current events, or the market, how the market's behaving, how
investments are, blockchain, Bitcoin, whatever it is that the
subject is,
(23:19):
or the broadcast is. You can go deeper and deeper,
wider and wider, broader and broader on that feed.
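
The loop he just described (talk over music, invite the listener to hit the center mic button, banter if they do, then cue a song and move to the next segment) is simple enough to sketch as a little state machine. The segment names and structure below are illustrative assumptions about that flow, not radio.fyi's actual code.

    from itertools import cycle

    # Hypothetical rotation of station segments.
    SEGMENTS = cycle(["hot topics", "current events", "markets", "blockchain"])

    def run_station(mic_pressed, turns=3):
        """One broadcast loop; mic_pressed() returns True when a listener wants to banter."""
        for _ in range(turns):
            segment = next(SEGMENTS)
            print(f"[talk over music] Here's the {segment} rundown...")
            print("Hit that center mic button if you want to go deeper on this.")
            if mic_pressed():
                print("[listener Q&A] Bantering wider and deeper on the segment...")
            print("[song] Here is such-and-such by such-and-such.")

    run_station(mic_pressed=lambda: False)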
Speaker 4 (23:26):
I feel like Spotify, I feel like Apple, I feel
like iHeart. I feel like everybody should be investing into
this technology.
Speaker 1 (23:33):
I saw.
Speaker 5 (23:34):
I showed this to Bob Pittman and he was like,
yo, we need to meet with Pullman. So he was
like, meet me on the tarmac
Speaker 4 (23:45):
Uh.
Speaker 5 (23:45):
in Van Nuys. I showed it to Pullman and Pittman,
they were blown away by it. And then Ashley from Global
Speaker 2 (23:53):
Saw it.
Speaker 1 (23:55):
Uh.
Speaker 5 (23:55):
But to your point, traditional radio folks are like, I see
what this is, I see what it is you're trying to do.
But more importantly, usually this technology comes from, like, folks
that aren't from our community, and then we're forced to
have to reckon with outsiders that don't really care about
the sensitivities and the fabric of, like, what we call
life, right? And so rarely ever does it
(24:19):
ever come from us. Like, Beats Music came from us,
with Trent and Jimmy and Dre and myself. And,
you know, if
Speaker 1 (24:28):
this was the SUV,
Speaker 5 (24:29):
I was in the back-back seat, but it was cool.
I love being a part of the Beats family. But
in this case, it came from my brain and my
co-founder and CTO, Snils, and Julie Pilot, who used to
be at KISS. And we have a great little small
team at FYI, and we envisioned this and we
(24:49):
launched it on National Radio Day, August twentieth.
Speaker 2 (24:52):
Well, you know, will.i.am got to go
Speaker 1 (24:53):
at ten o'clock, but he has time to get to one
Speaker 6 (24:56):
more question. Of course. How do you feel about... because
it seems like you're really into, like, making this
factual, accurate, so it doesn't scare people. When you see
all the fake AI videos, like the one right now,
like Trump and Kamala walking on the beach, and, like,
some of the misinformation, how do you feel about that?
And how do people protect themselves, like people that don't
want to invest in this? Like, how do they protect
what they're investing?
Speaker 7 (25:14):
And so it doesn't end up like that.
Speaker 5 (25:16):
Okay. So, what I see, like, text-to-image: that's
Photoshop on steroids. In the early two thousands, I could take
a picture of anybody, put it in Photoshop, and take
hours editing the picture to do whatever you wanted
Speaker 1 (25:34):
To do with it.
Speaker 5 (25:35):
Like, that has always existed: since computers, since paintings. A
motherfucker could paint anything on the canvas and use their
imagination to do that. Photography: if you're
real slick with the photography, you could do some type
of manipulation on that. It's art, it's expressive, and it's
for the viewer to decide: is that real or not?
(25:58):
And because we live in a world where
misinformation is now entertainment, and then what turns into entertainment
gets distorted as truth, because people have, like, objectives that
are ulterior motives, to deceive people or confuse people.
We as the viewer and the listener, we have to
just be a little bit more, you know, cautious of,
(26:20):
like, is this fake or is this real? Like, we
need to put it in our society, as just people
in this digital society where it's indistinguishable, the information
that we get, what's real, what's fake. We need
to take more caution. It's like when you're walking down
New York, you got to look left, then right. When
you go to London, you got to look right, then left.
(26:41):
Like, when you're crossing the street, you got to be mindful.
You ain't just out there like, through, through, through. So
when you're crossing these digital streets, you got to be mindful.
Speaker 1 (26:49):
But see, that app, and I know
I gotta go, that does everything, right?
Speaker 3 (26:53):
So, like, even if you're in sales and you have
to write a thirty-second spot for, let's say you
wearing Gucci, a thirty-second spot for a Gucci shirt that's
Speaker 8 (26:58):
A new line.
Speaker 3 (26:59):
You can put it in there, and that thirty-second spot
come right out, so you don't necessarily... it'll help the
sales department as well.
Speaker 1 (27:05):
Oh no, and by this time next year, it will
do the sale. You'll be able to purchase through it.
Speaker 5 (27:12):
You'd have commerce. Now that host is
now your sales agent, and you could transact. You'll
have GPS, turn-by-turn GPS, whether you're in
a car, walking, or on a bike. You have
your radio station, and that station's not only telling you
the traffic, it's also giving you, like, a better route.
Also, like, you know, I want to purchase this, and
(27:33):
when you get to the place, you just pick it up.
If you can't pick
it up, it could also connect to, like, you know,
some other type of agent or a person to go
out and pick it up for you. Like, I want to...
I want to take radio and do what radio has
never been able to do. That is transactions, real-
(27:55):
time information, assistance, and guidance. But more importantly, take a
DJ or host and supercharge them.
Speaker 2 (28:03):
Every time you come here, man, I feel like I'm
not using my brain enough.
Speaker 5 (28:08):
How do we get it? So, for FYI, go to
fyi.ai. Radio is out.
Speaker 1 (28:16):
Our personas are there.
Speaker 5 (28:17):
We're gonna build... we built, uh, by Invest Fest,
we built a persona for the Earn Your Leisure
team, so they'll have their... they're one of the first
folks to get their own voice for their station.
Speaker 1 (28:31):
We want to... we
Speaker 5 (28:31):
Would love to do like a breakfast club station. Love
to I have a we have a meeting with time now,
like hopefully this meeting goes well.
Speaker 1 (28:40):
Just putting it out there in the universe. That's so dope.
Speaker 3 (28:44):
Just think about it: if you're a DJ and you're listening
and you're going to a market that you're not familiar with,
you could ask the AI what the hottest songs in
that market are, and it'll give you a list of
the hottest songs in the market.
Speaker 6 (28:53):
Does it know, like, your personal opinions and stuff? Like,
so, like in, like, a Drake-Kendrick situation, can you make
it say things, so that the host doesn't have to be
the person that, like, oh, this opinion
is coming from this person? It's like, no, it was
my AI co-host. Like, is that a thing too?
Speaker 1 (29:08):
If you... if we build a persona for Lauren, you could
make it... you can make your
Speaker 5 (29:13):
Persona persona however creative you can have to have a stance.
We just don't support like I don't want to make
like hateful agents. I don't want to like distorted like
who wanted to beef.
Speaker 1 (29:27):
This one one?
Speaker 6 (29:28):
I want to, 'cause, you know, the host got to
get into stuff sometimes, and sometimes people don't want to.
So I'm thinking, like, people will lean on a bot
so that you don't have to be the person that
is the face of what was said.
Speaker 7 (29:37):
You wouldn't do that.
Speaker 1 (29:40):
There's a thin line that I don't want
to cross. Like, yes, you can do that.
Speaker 5 (29:46):
We're at a place where you could... you can have it
do whatever you wanted to do.
Speaker 1 (29:50):
But we don't want to go... we don't want to cross it.
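
A small sketch of the line he's drawing: personas can be arbitrarily creative and hold a stance, but a creation-time check refuses hateful or beef-stoking configurations. The field names and the banned-intent list are invented for illustration, not FYI's actual policy code.

    from dataclasses import dataclass

    BANNED_INTENTS = {"hateful", "harassment", "stoke_beef"}  # hypothetical policy list

    @dataclass(frozen=True)
    class Persona:
        name: str
        stance: str            # e.g. "news-first, quick-witted"
        intents: frozenset     # what the creator wants the persona to do

    def create_persona(cfg: Persona) -> Persona:
        """Allow any creative persona, but refuse to mint hateful agents."""
        if cfg.intents & BANNED_INTENTS:
            raise ValueError("that crosses the line we don't cross")
        return cfg

    lauren_ai = create_persona(
        Persona("Lauren AI", "news-first, quick-witted", frozenset({"inform", "banter"})))
    print(lauren_ai.name, "is live")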
Speaker 2 (29:53):
Ai shot Jacks.
Speaker 7 (29:54):
You know, you know how people are gonna get with
stuff that's yes, it's just so interesting say yes, who
I am.
Speaker 2 (30:02):
I appreciate you. You will not be a billionaire
without us.
Speaker 5 (30:05):
If there's a way... if there's a way, like, seriously,
like, there's a way to, like, rock with you, Charlamagne,
and get your insights,
Speaker 1 (30:13):
uh, have you be an, you know...
Speaker 5 (30:16):
Rock with us, have a stake in what we're doing. I'd
love to... pay you in upside. We'd be honored.
Speaker 2 (30:20):
Talk about that on the radio? I'd love to. I'm
gonna talk to you about that off the air, will.i.am.
Speaker 1 (30:26):
It's the Breakfast Club. Good morning. Wake that ass up
in the morning.
Speaker 2 (30:29):
The Breakfast Club