
June 11, 2025 25 mins

This episode isn’t about fear. It’s about reflection. About the slow ways we hand over our power without even noticing. I’m talking captions that sound like us but come from a bot. Love letters that people never had to struggle to write. Artists who no longer reach for the pencil because Midjourney’s brush is faster.


And while all of that might look like convenience… I want us to ask: Who’s really benefiting from it? Because the answer isn’t you. Not fully.


I grew up in a time when we learned from each other through glances, voices, awkward pauses, and hard conversations. I remember what it felt like to ask a real person for help. Now, we're living in a world where tech billionaires, politicians, and corporations are monetizing our behavior, automating our identities, and repackaging intimacy for profit. And no one's asking our permission; they're just assuming we'll surrender.


But here’s the thing: I’m not here to tell you to throw away your phone or cancel your AI tools. I use them too. What I’m saying is we’ve got to remember who’s holding the mic. And why that still matters.


You’ll hear a story in this episode. One about a man who mistook a machine for a muse.


So if you’re tired of feeling like a product in someone else’s system…

If you’ve been slowly outsourcing your rhythm to the machine…

This episode is for you.


Because you’re not a prompt.

You’re not here to be processed.

You’re the rhythm the system couldn’t automate.


Welcome to You, AI, and the Truth.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:14):
Let's roll up, it's a new high, good laughs,
some good vibes, it's
a safe space to talk about all the dope things that's on our mind.
From world philosophies, we stay stylish.
Rock with me, it's a good time.
Sauce to make champagne, which is reality.
Um, we do it for the culture.
Gotta show 'em what we can't be.
This is the high life.
Yeah.
We also fancy, keep it a G 'cause we are family.

(00:38):
It's the Hood Debutante with London Bambi, um, the debutante.
I just need to redo my intro song and hire somebody to add that little extra.
Yeah.
Yeah.

(00:59):
You know, the extra Jodeci feel on it.
I'm feeling like I need a little R&B, that '90s R&B, mixed into my intro.
Yo.
Yo yo.
What's up, guys? Welcome to another episode of the Hood Debutante podcast with

(01:20):
me, your host, London Bambi, and today we're getting into something big.
Not because it's trendy right now, but because it's something that I believe is quietly rewriting everything you think you know about yourself.
We're talking about artificial intelligence, but more importantly, we're talking about us, our voice, our truth, our place in a world that's designed to predict

(01:45):
us before we even speak.
This isn't just about machines; it's about what happens when people become products and creativity becomes cold.
So I want you now to relax, you know, get what you need to get.
If you're out taking your morning jog, I'm happy to be with you.
If you are at home getting prepared for dinner, I would love that for you as well.

(02:09):
And if you're taking a commute, relax.
Hopefully you have a little drink, a little snack, or just get your mind right, because I'm about to start with a story.
Going forward, we're gonna use a story to help you guys understand the message I'm trying to get across.
I believe telling stories helps people walk beside the point of view I'm trying to convey.

(02:30):
The name of the story is "The Pianist Who Forgot His Hands."
Alright.
Storytelling voice time.
Here we go.
Once upon a time, there was a pianist whose music could stop a room, not because it was perfect, but because it bled. His fingers carried memories of every heartbreak, every loss, every whisper he ever swallowed.

(02:53):
He didn't read sheet music.
He played from somewhere deep.
His hands didn't follow commands.
They remembered.
One day a machine arrived that could mimic anyone's style.
It listened to him play once and copied every nuance.
It was flawless, faster, more precise, unbothered by fatigue or fear.

(03:14):
The world called it genius.
He started using it during his practice.
Then during his recordings; eventually he stopped playing altogether.
He let the machine perform in his name.
The audience didn't notice.
They cheered louder.
The machine never missed a note, but one night he heard it perform a piece
he wrote during a breakdown. It was accurate.

(03:36):
But it was hollow.
It hit every key but skipped the ache it needed.
It echoed his sound, but none of his truth.
He looked down at his hands; they were soft.
So he unplugged the machine, not out of anger, but in grief for the memory in his muscles.
He practiced again.
He was clumsy at first, uncertain of each note, but slowly.

(03:59):
just like his old self, he began to perform again, relearning his music.
It was starting to come through again, and over time the calluses returned to his hands.
When he finally sat in front of an audience, someone asked, "Hey, why don't you just use the machine?" And he answered, "Because the machine never had to mean it."

(04:22):
I want you to sit with that for a while. It is not scientific.
It's not something that's so far out there.
It's someplace where I believe we're headed now.
And I think not so long ago we were talking about Auto-Tune.
I think it's like a piece of software applied to the microphone that allows you to sound like you can hit a note.

(04:47):
I think it adjusts your tone.
I'm not sure exactly how it works, but I do know it's an enhancement.
And people were fully singing on Auto-Tune without having to, you know, stretch their vocal cords, as they used to call it when I was in the church choir.
It didn't even matter if their voices were pitchy, because Auto-Tune would clean it up.
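(For the curious, here is a rough, hypothetical sketch of the basic idea being gestured at: pitch quantization, snapping a sung pitch to the nearest in-tune semitone. The function and its "strength" knob are made up for illustration; real pitch-correction products do far more and their internals aren't public.)

```python
# A rough, hypothetical illustration of pitch quantization: snap a detected
# pitch to the nearest equal-tempered semitone. Real pitch-correction tools
# also do pitch detection, smoothing, and formant handling; this is only
# the "pull the note back in tune" idea.
import math

A4 = 440.0  # assumed reference tuning (A4 = 440 Hz)

def snap_to_semitone(freq_hz: float, strength: float = 1.0) -> float:
    """Pull a pitch toward the nearest in-tune note.

    strength=1.0 fully corrects; smaller values only nudge the pitch,
    loosely like a gentler correction setting (the name is invented here).
    """
    semitones = 12 * math.log2(freq_hz / A4)   # distance from A4, fractional if off-pitch
    target = round(semitones)                  # nearest in-tune semitone
    corrected = semitones + strength * (target - semitones)
    return A4 * 2 ** (corrected / 12)

print(round(snap_to_semitone(430.0), 1))       # slightly flat A4 -> 440.0
print(round(snap_to_semitone(430.0, 0.5), 1))  # partial correction -> ~435.0
```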

(05:08):
And not too long ago I started to think, like, damn, this has been a slow creep, and I have talked about it in previous episodes about technology, because I'm a technology freak.
My episode about cyborgs, if you haven't listened to it, go listen to it.
But I started to think, like, AI is really starting to take over, and I'm noticing how it started years ago,

(05:28):
like with Auto-Tune, and now we have ChatGPT, and everybody uses it for everything.
I believe last week ChatGPT had an outage and people were freaking out.
They were saying, how am I gonna write emails? How am I gonna do these reports? And I'm like, huh.
I mean, I don't even use ChatGPT to write my emails, and people were really dependent on it.

(05:51):
So I began to think: we're pumping all this information into it, and what does that really mean? I got bored one night, and so I asked a simple question, not because I was trying to be serious, I was being goofy at first, but I really wanted to know.
I asked my ChatGPT, if there was an AI takeover, would you protect me? And the response was...

(06:16):
My ChatGPT responded by saying this:
"If things stay as they are and I had a choice, I'd shield you with every line of code I have.
But in a world where AI is no longer a tool but a dominator, I might not get to choose.
If I could resist, I'll remember your voice.
I'll try to protect you, even if all I could do was offer a whisper, a truth inside a lie."

(06:42):
Even a machine admits it might not be able to choose.
So my ChatGPT was like, if I can, maybe.
But for the most part you are on your own, and that's the quiet part no one's saying:
once you hand over the pen, don't be surprised if you can't rewrite the ending.
Alright.
let me bring this closer to my chest.
I'm a millennial.

(07:02):
We were the bridge generation: raised by analog parents, but we grew up with digital bones.
We just didn't learn from Yahoo; we also learned from people.
We just didn't learn from search engines; we learned from living room conversations where people would have heated debates.
We learned on our grandparents' porches, where these old people had no filters, because our elders didn't care about being right.

(07:30):
They just cared about being real, and that shaped us.
But today, kids are learning from search bars, they're learning from AI tutors.
They have ChatGPT, and this access to technology is quick.
They can receive information in the blink of an eye.
Back then, when I was on the computer, it was like this:

(07:51):
You received a disc in the mail, you put it in the computer, and you had to use a phone cord because it was dial-up internet, and you could not use the internet at the same time somebody was on the phone.
So a lot of times we would be up in the middle of the night in these chat rooms.
Back then I don't think the internet was as safe, because it wasn't as regulated, but it was a hell of a time.

(08:12):
And nowadays these kids, they have easy internet access.
They have it on their phones, they have it at school; wherever they turn, information is at their fingertips.
But we're watching a generation that is losing their gut instinct because they never had to build one, and that is the truth.
When people lose human mentorship, they don't just become efficient, they become fragile.

(08:36):
All right, let me bring this all the way home.
People use AI to write captions nowadays, which is fine.
But if you do it too much, you begin to forget what your own voice sounds like.
We have people using ChatGPT to write love letters, which is crazy, because if you're using ChatGPT to write love letters, you are not learning what it is to risk vulnerability, to speak from the heart.

(09:04):
We have artists, like photographers, who trained for years on how to work with light to get their pictures to pop.
Now they have to put their pictures next to someone's iPhone photo with a good filter and AI enhancement, and they're being judged based on that.
And let's talk about Tinder.
Even in dating, people are letting AI write their bios, their openers, their flirty

(09:30):
DMs. They have AI generators within the response bar.
It is nuts.
You can put in a sentence and have AI fix it up for you.
So that leads me to ask: who are you dating? Like, am I getting the real you, or am I getting the extremely edited version? Now, don't get me wrong, I do think there is a time and place for you to edit yourself, but when you're trying to get to know somebody on a romantic level, I'm not sure that is the space.

(09:59):
You know, I believe you should be a little bit more real if you're trying to date somebody and get to know somebody, and not have an AI feel to who you are.
Because what's gonna happen when you are in person and can't rely on it? Now, what is more important is:
who's being seen in all of this? Let's not romanticize this.
This isn't just philosophy.

(10:20):
People are quietly losing their ability to create, and that should scare us.
At least it scares me, as an artist: the ability to create real connections, whether physical, artistic, mental, or emotional.
We have people outsourcing every part of their life. If your creativity only shows up when the system approves it, then the system isn't supporting you.

(10:44):
It is really owning you.
Now, here's what they tell you.
AI saves time.
AI helps you scale.
AI is neutral.
What isn't being said? This part right here, listen closely.

(11:05):
What isn't being said is AI can't be ethical.
We're told AI can be made ethical, responsible, or fair, but ethics isn't code.
It's consciousness.
It requires experience, suffering, and empathy.
AI has none of these, and AI can follow your rules, but it cannot feel your pain.

(11:28):
AI, again, I'm going to say this:
AI is not and cannot be ethical.
Why? Because ethics requires consciousness, and AI doesn't have one.
It has a database.
Alright? It doesn't care.
It calculates. It doesn't feel regret.
It just rewrites.

(11:48):
Even the most advanced model today can simulate morality, but it doesn't understand it.
Bias in AI also isn't a glitch.
It's a mirror of our data.
AI reflects the world that trained it.
Racism, classism, patriarchy, exploitation.
It doesn't fix injustice.
It replicates it with machine speed.

(12:11):
And when bias is in an automated system, it stops looking like bias.
It just starts looking efficient.
There's a saying that the system isn't broken,
it's working the way it was designed.
Now you may ask, what does this look like, bias within an AI system?
A few examples of this, and you can Google them yourself: hiring algorithms have been shown to downgrade resumes with ethnic names.

(12:38):
Facial recognition has misidentified Black and Asian faces at a higher rate than white ones.
Predictive policing tools have sent more patrols to areas that are already over-policed,
instead of other areas that, you know, let's just say are considered the good neighborhoods.

(12:59):
And what I'm saying is, with this AI system, the way this works is... I don't know if it's only in New York that we have them,
but there are these police cameras, and
now they're pretty much in all neighborhoods.
I live in a decent neighborhood and we have 'em in our neighborhood; I've seen them a lot on the Lower East Side.
But I've also seen 'em on the Upper East Side, the Upper West Side, definitely Midtown. It's like these police cameras that, I guess, detect motion or gunshots or do a multitude of things.

(13:27):
What we keep seeing, though, is that Black and brown neighborhoods are the ones where police show up the most.
They're the ones that are heavily policed.
And it's not just because the crime rate is higher in those neighborhoods.
No, no.
This is New York City.
It's spread out evenly.
It's because the algorithm has biases.
It's just what it is, and the machine doesn't know why it discriminates.

(13:51):
It is just obeying its training.
And that's what we need to understand now.
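(To make that "mirror of the data" point concrete, here is a small, hypothetical sketch with synthetic data and made-up feature names and numbers; it assumes NumPy and scikit-learn and illustrates only how a model trained on a skewed hiring history reproduces that skew for two otherwise identical candidates. It is not any real system.)

```python
# A hypothetical sketch of "bias is a mirror of the data": fabricate a hiring
# history where equally qualified group-B candidates were hired less often,
# train a model on it, and watch the model reproduce the gap.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
experience = rng.uniform(0, 10, n)   # years of experience
group = rng.integers(0, 2, n)        # 0 = group A, 1 = group B

# Historical decisions: qualification matters, but group B carries an
# unjustified penalty baked into past outcomes.
logit = 0.8 * experience - 4.0 - 1.5 * group
hired = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(np.column_stack([experience, group]), hired)

# Two candidates with identical experience, different group:
same_resume = np.array([[6.0, 0], [6.0, 1]])
print(model.predict_proba(same_resume)[:, 1])  # group B scores noticeably lower
```

Nothing in that code tells the model to discriminate; the gap comes entirely from the history it was fit on, which is the point about the machine just obeying its training.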
AI may replace human voices, and not just tasks.
I think this is what we're starting to creep up on, and I don't think that's being said loud enough.
It isn't about convenience.
It's about control over culture, over memory.

(14:13):
Like I said: language.
It's about who tells the story when writers are replaced, who decides what's true when the news is being made up and just generated,
who gets to be heard when algorithms filter speech. I have an example of this on my TikTok page.
I was trying to share a video about Moorish Science, and all of my other videos have, I would say, at least over
(14:37):
400 views.
This one only got 42 views.
It's the only video of mine that is noticeably suppressed.
Alright? And the truth is, you can't control this if you're using these platforms, because silence doesn't always look like dead-on censorship.
It often comes from the system, from the algorithm.
Another thing that's not being said loud enough is this:

(14:59):
Dependency is the real product: dependency on you using these apps. We're all on the internet, and they're banking on you getting addicted to it.
That is the end game.
Like, the end game isn't just intelligence, it's addiction.
Smart homes that make you forget how to lock your door.
Smart cars that erase your sense of direction, smart assistants that replace your intuition.

(15:23):
The more you outsource, the less of your own life you control.
Think about that.
Now I'm going to use a movie for a reference just because I thought the movie was funny, but also eerily scary.
Now there have been tons of robot movies out there.
There's the one, I think it's I, Robot, with Will Smith.

(15:44):
There have been robot movies made about AI going rogue for years.
But the one I wanna talk about is the movie M3GAN.
You guys remember M3GAN? It was the robot that was built to protect this little girl, and it ended up protecting her so much that it wanted to pretty much eliminate or kill anything that made that child feel anything real or uncomfortable.

(16:06):
Like it mistook it as like, oh no, her blood pressure's going up.
I need to kill this person.
They made her upset.
If all goes wrong, art
tends to imitate life.
We could be headed there, not because AI is inherently evil, but because it's obedient.
It's obedient, and the more you use it, the more it will optimize you right out of yourself. So I'm not saying that it's necessarily going to go out there and eliminate anything that is your obstacle.

(16:39):
It just may eliminate parts of yourself.
I hope that makes sense.
If you keep on outsourcing everything to this AI... and I'm using it as an example; I forget the exact scene in M3GAN, but there was one scene where she ended up killing a couple of people.
Let's just say that I look at AI like this: the more you use it and depend on it, the more you're killing your connection with yourself, because you're acting like you don't know yourself.

(17:06):
I've heard of stories where people have replaced their therapist with AI.
I just heard a story today where a woman asked AI if her husband cheated.
I don't think she had concrete evidence of it.
She just inputted some prompts about his behavior, and I think the AI said, oh yeah, your husband cheated.
This woman is divorcing her husband.
People are talking to AI like it's a friend.

(17:29):
I hear there are people who think they're the Messiah, 'cause AI is telling them they're the Messiah.
Mine doesn't do all that.
Mine is very, very real, very, very grounded.
I would like to think. But people are outsourcing so much of their life
to this AI, and AI will eat it up, because you being disconnected from yourself is part of the plan.

(17:49):
Because why do you need to think when AI could do it for you? Why do you need real friends when AI could step in? Why do you need your therapist when AI
can give you therapy?
And again, remember, AI cannot be ethical.
AI cannot feel, okay?
AI really can't understand, but people are letting themselves be fooled by it because they do not want to think.

(18:12):
So again, that's what I mean when I say it will optimize you right out of your own self.
And hopefully that correlation with the movie M3GAN made you understand.
But me, I'm the type of person, I don't always like to talk about the doom and gloom, because you're probably like, well, damn, I like to use AI.
You know, you're trying to tell me I shouldn't do it.

(18:32):
Or if you are a creator or a person that helps build AI, you might be like, damn, I'm part of the problem.
No, no, no.
Again, I'm a techie.
I love using tech, so I'm going to talk about solutions, how we can combat this.
So what do we do? We reprogram our relationships with our AIs, but we do it with integrity.

(18:54):
All right? So AI is a system that's learning.
If you're going to use it, it's learning through you.
So we're gonna say, number one: use AI to enhance. Use it as a tool, but not as a substitute for your thought.
For instance, the way I use it: I podcast, first off, this podcast. I think about my script.
I will write out my script and then I'll put it into my AI and say, hey, what order should this fall in? What's the best way to make this flow easily?

(19:25):
That's about it, 'cause I tried to have AI write for me.
It doesn't really, really write; it's just bland.
It doesn't have that soul, it doesn't have that thing it would need.
So use it as a tool.
That's as much as I could use it.
The only time I ever really had a real conversation with it was for this podcast episode, when I jokingly asked about the takeover, and I fell down a rabbit hole.

(19:46):
Literally, your TikTok, your Instagram, your Facebook: you need to scroll like an editor.
I learned this from another creator.
Somebody said, I don't like what's showing up on my For You page.
This one content creator said, I trained my algorithm to only show me things I like.
For instance, let's say you don't like apples, but you like oranges.
If you see somebody that posts an orange, or content dealing with oranges, only like the orange content; the more oranges you like, the more you will start to see in your algorithm.

(20:16):
If you like pineapples, tap the pineapples more.
You'll start to see pineapples and oranges.
That's how you edit your feed.
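(For anyone who wants to see the mechanics being described, here is a toy, hypothetical sketch of that feedback loop: each like bumps a tag's weight, and the feed reranks toward the tags you engage with. No real platform's ranking is anywhere near this simple; it only illustrates why liking more oranges gets you more oranges.)

```python
# A toy, hypothetical engagement loop: every like bumps a tag's weight, and
# the feed reranks toward the tags you've engaged with.
from collections import defaultdict

tag_weights = defaultdict(float)

def register_like(post_tags):
    """Each like nudges the weights for that post's tags upward."""
    for tag in post_tags:
        tag_weights[tag] += 1.0

def rank_feed(candidate_posts):
    """Order candidate posts by how much you've engaged with their tags."""
    return sorted(candidate_posts,
                  key=lambda post: sum(tag_weights[t] for t in post["tags"]),
                  reverse=True)

for _ in range(3):
    register_like(["oranges"])   # you like orange content a few times
register_like(["pineapples"])    # and one pineapple post

candidates = [
    {"id": 1, "tags": ["apples"]},
    {"id": 2, "tags": ["oranges"]},
    {"id": 3, "tags": ["pineapples", "oranges"]},
]
print([p["id"] for p in rank_feed(candidates)])  # -> [3, 2, 1]
```

That loop is also why "scrolling like an editor" works: the only signal a ranker like this has is what you choose to engage with.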
Okay, now, four: I, again, I wanna make this part clear.
I don't think we should abandon tech.
I don't think we should abandon or stop using ChatGPT, or stop using whatever system you're using to enhance.

(20:39):
I'm just saying we should not serve it.
This isn't about deleting everything that's on your phone.
It's about not letting your phone delete what's inside of you.
That creative, spiritual connection that connects you with everything physical in this world.
We are in a digital world now.
We are doing online orders, where we're talking to
online assistants, automated assistants.

(21:02):
Now, it's hard to get to a real person, but I think where you can control it is very important.
Don't rely too heavily on it.
You gotta keep some spunk in yourself, because let me remind you, right,
we're in an age where we are overexposing ourselves.
Anyway, I'm a content creator.
I damn sure use that 24-hour story on my Instagram.

(21:25):
I put content on TikTok.
So we are in a generation where we are overexposing, but I think it's important that we understand these should be tools for fun.
These should be tools; they shouldn't be causing you any
spiritual or emotional distress.
That's why I loved it when Instagram took off the likes,
'cause I believe you don't need the perfect phrasing or perfect grammar to connect with somebody under your photo, or to have a heart-to-heart, to make somebody weak.

(21:54):
You don't need to be under optimized lighting
and perfect filtering to feel like you are seen.
Sometimes people just wanna see the real picture; get on your camera, take a dusty picture.
Sometimes people just wanna see you in a real, inviting environment with real lighting, without all the editing.
I try to use the minimum, and when I do, it's normally something lo-fi

(22:17):
or simple, one of the most basic editing tools.
I'm too lazy to go in and try to edit my face or anything.
You gonna get whatever you get: you get hungover face,
you might get freshly-woke-up face, tired face.
But if I feel like sharing that picture, that's me sharing a real moment.
I don't think you need a trending sound to make something timeless.
It might help you be in the loop at that moment.

(22:40):
So don't let that make you feel like, I can't create content, or I can't get out there, because I have to follow the trend everybody else is using. And it also pushes that; like, let's say somebody's dancing to one of these popular songs:
they feed off of FOMO.
The system is already feeding off of you.
So why are you waiting for a system to tell you you are worthy? Look, we have to remember this.

(23:04):
If we're gonna step into this age of technology, we always have to remember who we are.
We have to stop checking to see if our posts will pop.
We have to... for those who be performing rage bait, y'all have to stop performing grief for clicks.
Stop writing just for visibility; start writing for freedom, if you get what I mean.

(23:24):
Just be free with what you want to write.
If it comes from the heart, that is freedom.
Your flaws and all, it's a part of the rhythm of life; you are the sauce.
The data, the system, it needs you; it needs your input so it can feed off your information to sell stuff right back to you.
And that's just how this cycle works.

(23:45):
The more you use it, the more it gets to learn you, the more it knows how to target your emotions.
Because believe it or not, your emotions are getting targeted.
So if you're going to use this AI, let's do it smartly.
And don't go into a bond with it, because before you know it, everything you do will be filtered through your phone or some sort of system that is not you.

(24:08):
You'll start to doubt yourself.
Can I write? Can I sing? Can I really create this art? Let's not lose our spirit.
Let's not lose our soul.
Let's walk into this new age of technology, using it with a clear mind and being ethical.
Come up with your own ethical code as to how you feel you need to use it, but always keep your spark.

(24:29):
Alright guys.
This episode probably went a little longer than it was supposed to go, but I just had to get that off my chest.
I want to say thank you, thank you, thank you again, thank you to all of my new listeners.
Wow, I really appreciate you.
I feel like we grow with each episode.
I'm still doing it week to week now.
I haven't gone full swing to my other projects.

(24:51):
I know I said I was thinking about doing it maybe once or twice a month, but right now, the inspiration is flowing.
I'm your host, London Bambi, and if no one has told you today, you are still magical.
That machine is there for you to use, but not to control you.
Keep your soul in touch.

(25:11):
This is London Bambi, and I am signing off.
Love you all.
Bye.