
August 12, 2025 5 mins

What happens when AI-generated videos go viral and no one knows what's real anymore? This episode dives into the shocking (and disturbing) story of a whale trainer allegedly attacked by an orca, only to reveal it's a completely fabricated AI creation. The Jubal Show unpacks the growing challenge of deepfakes and online misinformation, and the urgent need for AI regulation. Plus, hear eye-opening facts about wildlife decline that will leave you questioning the future of our planet.


Nina's What's Trending is your daily dose of the hottest headlines, viral moments, and must-know stories from The Jubal Show! From celebrity gossip and pop culture buzz to breaking news and weird internet trends, Nina’s got you covered with everything trending right now. She delivers it with wit, energy, and a touch of humor. Stay in the know and never miss a beat—because if it’s trending, Nina’s talking about it!


This is just a tiny piece of The Jubal Show. You can find every podcast we have, including the full show every weekday right here…

➡︎ https://thejubalshow.com/podcasts


The Jubal Show is everywhere, and also these places:

Support the show: https://the-jubal-show.beehiiv.com/subscribe

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
It's time for Nina's What's Trending! Brought to... my mom?
Shooting in your home for machine gun?

Speaker 2 (00:06):
You don't know how appropriate that was right now, because
that's how I feel. This story that I just found
has been going viral, and that...

Speaker 3 (00:19):
Is the reaction.

Speaker 1 (00:20):
I don't even know what just happened.

Speaker 3 (00:22):
I'm trying to process.

Speaker 1 (00:24):
Just came out. Sorry about that. That's...

Speaker 4 (00:26):
How I feel.

Speaker 2 (00:27):
Okay, have you guys seen the video of this woman
who is allegedly a whale trainer having a confrontation
with the whale inside of the thing?

Speaker 3 (00:36):
Okay, inside of the thing.

Speaker 2 (00:39):
Well, she was basically training the whale, and then in
the video it shows her not making it out of
the mouth, kind of. Yeah, the whale killed her in
the video. Oh.

Speaker 3 (00:50):
I was trying to dance around it. You weren't getting it.
That was terrible.

Speaker 1 (00:55):
I thought like her and the whale were like, you know,
like jawing at each other. You know.

Speaker 2 (00:59):
No, it's that serious. The video shows it to
be that serious. But the reason why it's so disturbing,
on top of the fact that the image is already gross,
is that it's AI.

Speaker 1 (01:08):
So this is not real.

Speaker 2 (01:09):
No, this has been... Jessica Radcliff is not a real person,
so allegedly this whole story was talking about this
twenty-three-year-old whale trainer named Jessica Radcliff, showing this whole...

Speaker 3 (01:21):
Video of the...

Speaker 2 (01:22):
Orca and her in this space, and people watching,
like it was in a SeaWorld type
of situation.

Speaker 3 (01:27):
Well, that happened, it did.

Speaker 2 (01:29):
It did happen twice. It's been documented that it has
happened in the past, but nothing has happened currently, and
there's a lot of feels about it. But this is
just blowing my mind because it's like, you have this
disgusting video. Yeah, that's completely taken over the internet.

Speaker 1 (01:44):
Wow.

Speaker 2 (01:45):
And then at that moment people were like, well, should
we see if this is AI or real? Like, shouldn't
that... Like, you have to question these things.

Speaker 1 (01:51):
It's hard to know what is real and what isn't
real online anymore, in anything.

Speaker 2 (01:55):
No. Yeah. And then they leave the videos up, though,
because they're getting so many views, but you need to
have the disclaimer there, and so now there's an even
bigger conversation, like, should this stuff even be allowed, and
how do you, like...

Speaker 4 (02:08):
There should be, like, there has to
be a disclaimer, kind of like if you're posting
an ad, you have to put at the bottom,
like, this is an ad or whatever, this is,
this is AI. Like, there needs to be, I think,
a disclaimer. I would not have
even thought to check it, me either.

Speaker 3 (02:23):
And I only saw it because it was like AI.

Speaker 2 (02:25):
And then I went backwards and got all caught up
and I was like, this is so disturbing. But there
has to be also some type of technology where it's like,
if you upload something to any platform, it should be
able to flag it, right if it's AI.

Speaker 3 (02:36):
Is there a way to know that?

Speaker 1 (02:37):
I don't know if they've created that, but I'm sure
they can create it. They can make anything with AI.
Actually, yeah, have AI tell on itself, I guess, but they
should have stuff like that. That's why all the people
who have started AI and some of the founding people
who created it have like stepped away and said they
need to do something about regulating this. If not, we're
gonna have a real problem. And they still haven't really
done anything to regulate it.

Speaker 2 (02:58):
And so now we're just, like, seeing videos like this and
it's just... thankfully, it's not real, thankfully, but still
so disturbing.

Speaker 4 (03:06):
This gives me no hope for our future, though. We
literally have made so many movies where AI and robots
take over and, like, they...

Speaker 3 (03:15):
Really take us out? And what do we do after
we see the movie? Nothing?

Speaker 1 (03:19):
And you know how those movies usually start with somebody
who's an expert in the field going, this has to
be controlled or something bad is gonna happen, and they
don't listen to them.

Speaker 3 (03:27):
And then meanwhile, we all sound like Jubal. Now that's
literally what's happening, but we're doing nothing.

Speaker 1 (03:36):
I know, it's crazy.

Speaker 4 (03:38):
Hey, why... like, we're gonna get taken out. We're
gonna be gone. We're...

Speaker 3 (03:43):
Not gonna make it next year if we don't get this
under control. It's so frustrating.

Speaker 2 (03:48):
It is.

Speaker 1 (03:49):
It's a trip that people are not listening to the people
who are the experts. Ah. I mean, even Elon Musk
said it needs to be, yeah. And he's, he's, you
know, was in on some of the companies that
started it, and he stepped away from those companies because
he's like, this is going to get out of control,
and if somebody doesn't do something about this, it's going
to be really bad.

Speaker 3 (04:05):
Yeah, right. It could be so cool, but here we
are. Nope.

Speaker 1 (04:10):
Not to alarm people, too. But I also found out
the other day, and I don't know why this isn't
a bigger deal, but did you know that seventy percent
of the wildlife on the planet has gone extinct? What, over
time? Yeah, so based on what it used to
be. Like, they're saying even areas where you drive
through that used to have a lot of insects hit your
windshield and stuff, you can drive through them now and
no insects hit your windshield. So that's a lot. An alarming

(04:31):
sign is when the insects aren't flourishing.

Speaker 2 (04:33):
Yeah, that is interesting. My mom said she noticed there's
something weird going on with the bees. They're circling puddles
now, and that's not normal. My windshield will be cleaner, that
is nice. So that does not... though, I don't know,
but as long as they're running into my car, like,
they're feeding the robots insects? Oh god. If anybody else
has any alarming information about our future that can really
freak out us and our listeners right now, text it in

(04:54):
at four one oh six one and I'll share

Speaker 1 (04:55):
It with them.

Speaker 3 (04:56):
Please let us know. Thank you.

Speaker 2 (04:58):
Awareness is key, that's what's trending.

Speaker 1 (05:01):
She's got to learn to go with the flow, man,
you know? No, hey, whatever happens happens.

Speaker 3 (05:04):
No, yeah, I'm on that train right now.

Speaker 1 (05:06):
Put the blinders on, baby, go for a ride.

Speaker 3 (05:10):
No, I don't want to go for the ride. I
want to get off the ride. Someone let me off.
This candy I just bought said, life's a journey, man.
What is that, a fortune cookie?

Speaker 4 (05:17):
No?

Speaker 3 (05:18):
It was just a candy. Oh, it was a gummy.

Speaker 1 (05:22):
So frustrating.