All Episodes

November 11, 2025 • 27 mins

Lizzie Eastham and Sam Rickard present Studio 1 - Vision Australia Radio’s weekly look at life from a low vision and blind point of view. 

On this week’s show 

“What NOT to do on the Internet” 

Friend of the show Shayne Allen comes back to talk a little about his day job and give some advice on what NOT to do when confronted with threats to your online security. Lizzie and Sam also discuss some of their own experiences and look at some of the more worrying aspects of Artificial Intelligence.

Studio 1 welcomes any input from our listeners. Please get in touch if you have any experience or thoughts about issues covered in this episode, or believe there is something we should be talking about.

You may also be interested in joining our choir of angels and telling your story. 

EMAIL: studio1@visionaustralia.org or leave a comment on the station's Facebook page: https://www.facebook.com/VARadioNetwork

This program was made possible with support from the Community Broadcasting Foundation. Find out more at https://cbf.org.au/ 

 



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
S1 (00:14):
This is Studio One with Sam Rickard and Lizzie Eastham
on Vision Australia Radio.

S2 (00:28):
Hello, I'm Lizzie, and I'm Sam, and you're listening to Studio One, Vision Australia Radio's weekly look at life from a blind and low vision point of view.

S3 (00:37):
This week we dive back into the subject of cyber
security and scams. We're joined by Shane Allen, who works
in the industry, as it were.

S2 (00:45):
But as we always say at this point, please do
get in touch with the show. Whether you have experience
of any of the issues covered on this week's episode
of Studio One, or if you think there's something we
should be talking about. You never know. Your story and
insight may help someone who's dealing with something similar.

S3 (01:00):
You can email us: studio1@visionaustralia.org. That's studio, number one, at visionaustralia.org.

S2 (01:05):
Or of course, as usual, you can drop us a note on the station's Facebook page by going to facebook.com/VARadioNetwork.

S3 (01:17):
Hello, Lizzie.

S2 (01:18):
Good afternoon, dear Sam. How are we today?

S3 (01:22):
I'm relaxed and comfortable because, as part of our 25th anniversary of being together, that is, my lovely wife Heidi and myself, we had a spa and got all the horrible little knots and terrible things in my shoulders and legs ironed out. So yes, I'm

(01:42):
feeling relatively good, but still a bit sleepy for some reason.

S2 (01:45):
Well, I envy you. I mean, I'm also feeling sleepy, but with the added benefit of being stiff and sore, so...

S3 (01:52):
Well, one can't have everything. But one thing that you can't avoid nowadays does seem to be people trying to steal money from you, or information, or probably both. Uh, how much experience have you had with the infamous, uh, calls or emails?

S2 (02:10):
Oh, a lot. But, you know, to be honest, I've never had it eventuate into anything major. I do get the odd call saying, hey, we know that you've been involved in a car accident and that you have taken out a claim. And I said, that's very funny, because I'm completely blind and I don't drive.

(02:31):
And they hung up on me. Um, somebody also told me very recently of a PayPal scam, and it was really well timed too, which was crazy. They had a problem with a seller on eBay not delivering goods, or not refunding, or something to that extent. And they got an email from PayPal,

(02:55):
except PayPal had two L's and services had a capital S,
and they said, uh, yeah, this is probably not real,
is it? And I said, no, probably not.

S3 (03:05):
Yeah, yeah. Well, herein lies the trick to it, really: sometimes it seems like they know what you're doing, but all that's happening really is they're just sending out as many emails as possible. They've got your email from somewhere else, either a data breach or something like that, and they just send out a big group email and hope that one or two people buy it.

S2 (03:25):
Well, I would have to say it's probably got something
to do with the seller that they were engaging with
because they were a bit of a dodgy person, and
this person had to go to PayPal and get it
resolved to get their money back. So I would say
it's probably got something to do with that. And that's
the thing too. There are a lot of scammers on
eBay and marketplace too. And if you're a person that

(03:46):
uses Facebook but perhaps isn't familiar with online banking and things like that, it can be absolutely crazy, the number of people that get caught out by these things.

S3 (03:57):
Something that also really amazes me, as an old-timer you might say, is that there are scam ads all throughout social media, and on YouTube itself now.

S2 (04:09):
And they look so real, don't they?

S3 (04:11):
Yeah, they seem legitimate.

S2 (04:13):
They do.

S3 (04:13):
And they're listed as sponsored, but all they've really done is paid some money to have their ads shown. Now,
just think if a reliable radio station like ourselves or
a TV channel or a newspaper did that, how much
trouble would they be in?

S2 (04:32):
Quite a lot, especially if a lot of people got caught out by it and then there was backlash. It's crazy, isn't it? Like, when it comes to social media or YouTube, I feel like, as someone that regularly engages with and consumes a lot of that content, there isn't a barrier to entry as to who can sponsor certain channels

(04:55):
or who can have ads come up on YouTube. Like, it doesn't take a lot to place an ad. No, all you need to do is pay money for it, submit your commercial, have it all ready to go, and they'll pretty much give you the yes. There's not a big barrier to entry for new companies that want to come in on the market and advertise.

S3 (05:15):
Well, there's a couple of ads that I've seen on
YouTube recently that, um, kind of exemplify what I'm talking
about here. One is this miracle air conditioning system that
they have.

S2 (05:25):
Oh, I...

S3 (05:26):
...wonder...

S2 (05:26):
...what? Yeah.

S3 (05:27):
A tiny little thing that, yes, apparently is taking the whole...

S2 (05:30):
Somebody from Adelaide came up with it. Oh yes. Yeah. You know what, I saw that, and I was like, that just can't be real.

S3 (05:39):
And indeed it isn't because yes, I did actually see
another thing on YouTube which pointed out that, no, it
wasn't real. It's just basically a very bad evaporative air
conditioning system that doesn't really cool things that well. So yes,
you will get sent something. So in that way, yes,
you've paid your money and they will send you something,
but what they will send you is a piece of, well,

(06:03):
stuff that comes out the other end of a dog.

S2 (06:05):
So sort of like Temu or AliExpress?

S3 (06:09):
Well, yeah. Don't get me started on those ones. I don't... I don't even...

S2 (06:12):
I don't even get...

S3 (06:13):
...involved.

S2 (06:13):
...with it.

S3 (06:15):
Anyway, let us talk to someone who is kind of an expert, because he works in the field. Some listeners may, um, remember our conversation with Shane from the middle of last year, along with Andrew and Jodi at the same time. Because, yes, where I met Shane in the first place was travelling to Madrid for the IBSA World Games back

(06:37):
in 1998, and we've kind of kept in touch since. And, uh, yes, he's well known as an athlete, but, like the rest of us, he's gone on to bigger and better things. Over to you, Shane. Welcome back to Studio One.

S4 (07:02):
It's good to be back, mate. I'm happy to be almost a regular... irregular.

S3 (07:08):
Um, I see in the background here you've got a lovely Zoom background, as it were. It's an interesting Zoom background. Um, how did you get that one?

S4 (07:15):
I honestly have no idea what Zoom background I've got on.

S3 (07:20):
As in none. Where are you?

S4 (07:21):
Uh, no, I'm just down at my local cafe. Um, it's what I do. It's sort of my Friday morning ritual; I work from home every Friday morning, so I get out of the house, come down, grab a coffee, do a bit of work, the occasional radio interview now. Um, and, yeah, it's locals. I've got, uh,

(07:41):
both my dogs down here, my, um, my working guide dog,
Lottie and my retired guide dog, Bree. And we just
come down here and have a have a great time.

S3 (07:50):
So where is here?

S4 (07:51):
Here is a cafe called Fair Craven, and I'm on the Central Coast in New South Wales, at a place called Point Clare. And it's, um, yeah, God's country, mate.

S3 (08:02):
Yeah. All right. So, um, when we were promoting our last show on, um, scams and cyber security and everything wonderful like that, you dropped me a note saying, hey, come and talk to me, because I actually work in the field. So maybe you can start by telling us what you actually do.

S4 (08:19):
Yeah. So what I do is, I'm actually, uh, a sales manager. I manage a team of salespeople that sell fraud prevention software. And ours is a very specific type of software that essentially makes sure that when businesses are paying other businesses, uh,

(08:40):
that they're not falling victim to fraud. And I guess, uh, where it fits into the cyber world is making sure that we have a vast knowledge of the different ways people try to infiltrate your world, take advantage of you and essentially profit from it. So, yeah, that's, um,

(09:04):
what I do in a nutshell. But, uh, the company I work for is very well respected in the, um, cybersecurity world. And, um, yeah, myself, I manage people and make sure that they are well aware of what's going on.

S3 (09:18):
And that's sort of the thing, isn't it? There is a lot going on at the moment. Um, even ten years ago we would have thought that saying, alright, everyone's trying to hack your account, would be a bit paranoid, but pretty much everyone does seem to be trying something on nowadays.

S4 (09:35):
Well, that's right. And particularly, you know, in the last 12 to 24 months, with the advances in AI technology, we've got people cloning voices, deepfaking videos, and we've got people using AI to manipulate and create documents that look genuine. You know, that

(09:59):
used to be something that was not a problem, but now we've got these AI bots and whatnot doing it and making it look absolutely genuine. And it is really, really a big problem.

S3 (10:12):
I don't know about you, but I tend to be able to spot when you, say, have an AI video or an AI, um, voice. But I think that's kind of giving us a false sense of security, because the whole point is that scams work when we're at our most vulnerable.

S4 (10:27):
Absolutely they do. And they also work on the most
vulnerable people as well. Uh, like people that are vision impaired.
I mean, you say you can spot a video, but
there are plenty of us out there in the vision
impaired world that wouldn't be able to. And it comes
down to the voice. I mean, there's an absolute plethora of apps out there nowadays which are all about changing voices, cloning voices. And they're, um, you know,

(10:52):
they're legitimate apps that you can go and buy on the App Store. Uh, I think one's called ElevenLabs; that's kind of the most popular one at the moment. And you can literally, with only three seconds of recording of someone's voice, take that recording and basically create a pretty accurate clone of that

(11:12):
person's voice.

S1 (11:19):
This is Studio One with Lizzie and Sam on Vision Australia Radio.

S3 (11:33):
It seems to be the little things that tend to sort of, um, trip them up. So, um, I mean, one example: um, I watched a video on YouTube and, uh, they said, you know, you've got to live your best... live. Yeah, live your best life. Um, so there are still these little errors here. But, I mean, what I'm saying is

(11:54):
maybe when they do get it right is when we are actually going to be in the most trouble.

S4 (11:59):
Yeah. I mean, you can, you know, look out for those little nuances and things like that, and they will be there. But it also depends on how, I guess, diligent the fraudster or the cyber criminal is, uh, whether they're going to just use the basic program, or create their own type of program that they say can do that,

(12:21):
or they go and use one of these professional programs that uses AI and does it accurately. Again, on Facebook, there's a scientist that I follow from time to time, Neil deGrasse Tyson, and there was a deepfake put out a few months ago of him actually claiming, you know, that

(12:44):
the world is flat, and it fooled all his friends.

S3 (12:47):
Yeah.

S4 (12:48):
And when he looked at it, he said, no, no, no, I don't remember saying any of that. That's not me. That's not right. And then, when he looked at it, he was able to pick up on the, sort of, nuances in the rhythm of his voice and the pauses; he specifically puts pauses in his voice to emphasize and annotate things like that. So he was able to pick up on that, but the people

(13:09):
around him didn't. And that was a deepfake video as well, not just audio. So yeah, it can fool people, but it depends on how sophisticated the fraudsters want to be.

S3 (13:21):
And how much they're after from you. I mean, what are they after? Are they after stealing our fortune? Are they after stealing our identity? What's the point of all of this?

S4 (13:32):
Oh, they can't be after my fortune, because there isn't one. No. Um, it's a little of everything. I mean, every criminal has their own agenda and their own goal, I guess, whether it's power, whether it's money, whether it's, um, you know, information that they can then sell on; a lot of that is the case. There's, uh, you know, um,

(13:54):
identity theft is huge. The dark web is full of it. Uh, there are so many tools out there where you can just go and put your email in to find out if it has been part of a data breach, and it's actually quite scary how much of it is. And really, it probably does come back to money in the end, whether it's through sale of data, whether it's through money redirection scams, um, or, you know, ransomware,

(14:19):
things like that. There is so much that they are trying to take, but it probably really just comes back down to money or power in the end.

S3 (14:26):
Um, so I'm going to ask you a question that I asked, um, one of my previous guests, and that is: um, I'm rather fond of watching a few YouTube videos about scambaiters, basically. Um, and the one thing I do know, having worked in the IT industry myself, is that they know what

(14:47):
they're doing. They have virtual machines that are firewalled to the nth degree. What would your suggestion be if, for example, you decided, hey, I want to do a bit of this myself? I want to, um, take on the scammers myself.

S4 (15:00):
Yeah. Don't. These people are professionals; these people are skilled. They have tools that you cannot imagine, things that are available on the normal web, even on the dark web, more so on the dark web. Um, some even have government backing from some countries. Do

(15:21):
not engage with scammers, fraudsters, anything like that. Definitely, uh, steer clear.

S3 (15:27):
So if you see that email, delete it and, uh, possibly report it. If someone calls you up, then get out of that call.

S4 (15:36):
And there's an old adage, if it sounds too good
to be true, it is too good to be true.
You know, when you're talking about emails, it might come
from what looks to be a trusted email address, but
have a look at the actual email address. Have a
look at the URL attached to the email address. It
might look like something that you're familiar with, but there

(15:58):
could be an I instead of an L, or there
could be two L's where there's only meant to be one.
You know, there are so many, um, tricks that they use just to get past the, you know, the layperson out there.
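The lookalike-address trick described here can be sketched in a few lines of Python. This is an invented illustration, not any real mail filter: the trusted-domain list and the character substitutions are assumptions made up for the example.

```python
# Illustrative sketch of spotting a lookalike ("spoofed") email domain.
# TRUSTED_DOMAINS and LOOKALIKES are invented for this example.

TRUSTED_DOMAINS = {"paypal.com", "visionaustralia.org"}

# A few characters commonly swapped in because they look similar.
LOOKALIKES = str.maketrans({"1": "l", "i": "l", "0": "o"})

def normalise(domain: str) -> str:
    """Lower-case the domain and collapse lookalike characters."""
    return domain.lower().translate(LOOKALIKES)

def looks_spoofed(address: str) -> bool:
    """True if the sender's domain is untrusted but normalises to a trusted one."""
    domain = address.rsplit("@", 1)[-1]
    if domain.lower() in TRUSTED_DOMAINS:
        return False  # the genuine domain
    return normalise(domain) in {normalise(d) for d in TRUSTED_DOMAINS}

print(looks_spoofed("service@paypa1.com"))  # True: "1" standing in for "l"
print(looks_spoofed("service@paypal.com"))  # False: the real domain
```

A real filter would also need to catch doubled letters (the "two L's" trick mentioned above) and Unicode homoglyphs; this sketch only handles a few one-for-one swaps.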

S3 (16:11):
And when in doubt, instead of actually clicking on the
link they give you, maybe try navigating directly to the company. Um, relevant?

S4 (16:20):
Correct. Exactly. Never, never, never click on links in emails. One click can download spyware, can download, uh, all sorts of things in milliseconds. And once it's on your machine, it can sit there for months and months without doing anything. Uh, it can go through your networks. And the other thing is,

(16:44):
you know, you might say that you've got strong firewalls, you've got strong virus protection, you've got everything, but you don't know what your friends have. Mhm. You can't protect yourself from the, um, vulnerabilities of your friends' internet environments at home, your, you know, different workplaces,

(17:05):
free Wi-Fi at, um, airports, all those types of things. For scammers, for fraudsters, uh, you know, that's their way in. So just be very, very vigilant as to what you're clicking on, how you're accessing the web. And, um, yeah, always double check: if it's too good to be true, it probably is.

S3 (17:26):
Probably is indeed. And I mean, you brought up, uh, open Wi-Fi, for example. Um, so what should we not be doing if we decide to jump onto an open Wi-Fi system, for example at an airport?

S4 (17:39):
You should always... well, if you have to. I don't, so, you know, it's not something that I really ever keep at the forefront of my mind. But only access familiar websites, um, legitimate websites. Look for the little lock, uh, in the address bar that shows that it is a secure website. Yeah. Just don't. Yeah. I mean,

(18:01):
phones have so much data now, I don't really know why people need open Wi-Fi.

S3 (18:07):
Indeed. Um, that's, uh.

S4 (18:09):
And, uh, it's usually even faster than the free, open Wi-Fi around. So...

S3 (18:15):
And from a blind person's point of view, there's a lot more rigmarole trying to log into the damn things, and you don't know what you're agreeing to, even on the, uh, actual, legitimate side of things. Uh, so you might be getting these, uh, unexpected emails trying to sell you stuff, and it's like, okay, where did that come from?

S4 (18:31):
Indeed, indeed. Um, and then you've got to go through those bloody "I'm not a robot" challenges. Yeah.

S3 (18:37):
Oh, that's something we could go through as well, because, I mean, there is an ongoing case, um, both nationally and internationally, against companies that, uh, still do that. And, um, surely there are alternatives now. Don't tell me that your company is one of those that puts that sort of security feature on a website? Surely not.

S4 (18:57):
No, we do not. And, um, I was actually interested to find this out some time ago. It really is just, um, that the way they actually prove that you are not a robot isn't through clicking on the right pictures or whatnot. It's the way your mouse tracks to click on those pictures, because a robot will have direct lines to the pictures, or a very, you know,

(19:22):
continuous flowing motion. It tracks the mouse movement, and if it's sort of randomized, then that's how it knows you're a human. And clicking on the pictures with the ladders in them or the bus in them, or whatnot, is actually just training the AI to be able to recognize those things in images.
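The mouse-tracking idea described here can be sketched roughly in Python. This is a hedged illustration only, not any CAPTCHA vendor's actual method: the sample paths and the tolerance threshold are invented for the example.

```python
# Invented illustration: a scripted bot often moves the pointer in a
# near-perfect straight line to its target, while a human path wobbles.
import math

def max_deviation(path):
    """Largest perpendicular distance of any point from the start-to-end line."""
    (x0, y0), (x1, y1) = path[0], path[-1]
    length = math.hypot(x1 - x0, y1 - y0)
    if length == 0:
        return 0.0
    return max(
        abs((x1 - x0) * (y0 - y) - (x0 - x) * (y1 - y0)) / length
        for x, y in path
    )

def looks_scripted(path, tolerance=2.0):
    """Flag paths that barely deviate from a straight line (threshold invented)."""
    return max_deviation(path) < tolerance

bot_path = [(i, i) for i in range(0, 101, 10)]                            # dead straight
human_path = [(i, i + (7 if i % 20 else -5)) for i in range(0, 101, 10)]  # wobbly

print(looks_scripted(bot_path))    # True
print(looks_scripted(human_path))  # False
```

Real systems reportedly look at far more signals (timing, acceleration, browser state), so treat this purely as a sketch of the "straight line versus wobble" intuition from the conversation.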

S3 (19:38):
And we're back to AI again. Anyway, thank you so much for joining us. Um, have a good one, Shane.

S4 (19:46):
Always happy to be here, mate.

S2 (19:54):
Very insightful interview, Sam. I have to say, you guys
covered a lot of really interesting points, but one thing
that stood out to me was the use of AI.
And I think that whilst it can be innovative and
whilst it can lead to some great advancements, there is
a major increase in the distribution of misinformation or disinformation

(20:18):
because, like, I've seen... well, I don't use TikTok, but my husband does, and there are AI videos of, say, Joe Biden saying something completely crazy, and it's not even a real thing. It's an AI. So it's like, if you don't agree with, you know, the people on the other end of the political spectrum, just

(20:38):
get the AI out, fix something up, and off you go. Then you can start spreading disinformation, misinformation and basically a smear campaign. It's absolutely crazy. AI is a double-edged sword, I believe.

S3 (20:53):
I don't know, I can still usually pick it up, though. When I see an AI video or hear an AI voice, there's just something subtle at the moment that gives it away. Um, I call it the Governor Tarkin effect. Anyone who's ever seen Rogue One: A Star Wars Story will remember Peter Cushing playing, um, Governor Tarkin. And of course, Peter

(21:15):
Cushing has been dead since the 1990s. But you can tell it's a rather spooky AI; they just haven't quite got it right yet. And that's the vibe I tend to get when I'm hearing these ads and these deepfakes and stuff like that. There's something just at
the back of my neck that sort of makes me think, hmm, no,

(21:35):
there's something not quite right there.

S2 (21:37):
But, Sam, I would dare say that, you know, you're a pretty cluey person. And... okay, how do I put this? If, for example, you are
on one side of the political spectrum and there happens
to be a heated debate going on, and you see
a clip from the other side of the political spectrum

(21:58):
that engages you and arouses a particularly strong emotional response, that part of your brain might not necessarily work. So I'm not saying you in particular, but
there are a lot of people that will see it
and go, oh my gosh, I can't believe this person
actually said that without actually taking into account all of

(22:21):
the things that give away the fact that it's an AI, because it's eliciting a strong emotional response, which is exactly what these people want. Like, Steven's been tripped up by it, and it's not until I've listened to it that I've gone, yeah, that person doesn't talk like that. That's definitely an AI.

S3 (22:39):
Um.

S2 (22:40):
But you know what I mean. Like, it's crazy, because these people, usually when they release these videos, it's in the midst of, you know, a big debate that's happened, or a hot topic issue, whatever. And it's meant to elicit a strong emotional response, and in doing that, people don't really use the logical parts of their brains to discern whether it's correct or whether it's

(23:04):
a fake.

S3 (23:05):
One recent case, in fact: I mean, we have the story of the child going missing on his grandparents' farm, pretty much out in the middle of nowhere, and someone decided to, um, do a deepfake of the same child being abducted, basically.

S2 (23:22):
Which, wow.

S3 (23:23):
It's not just bad taste, it's fake news.

S2 (23:27):
Um, you're misleading people too.

S3 (23:30):
Exactly.

S2 (23:31):
Like, if you know you're misleading people... the child could come to harm from natural causes, you know, just from the fact that he's gone missing out in the wild. Like, you don't need to be adding more fuel to the fire, but that's what these people are trying to do: add more fuel to the fire, because it gets them likes, it gets them attention. Some people are sick, man.

(23:53):
What can I say?

S3 (23:54):
Well, um, it comes down to a saying again. I'm a bit of a sci-fi nerd, but it comes from a line in Jurassic Park, which, uh, Jeff Goldblum's character said: the scientists worked out what they could do and didn't stop to ask whether they should do it. And that's really what it comes down to in this big, bad world of ours now: just because something can be done, let's just take a step back and ask

(24:17):
whether we should. It's like, for example, us talking about
a hot button story like we were just talking about.
We kept it nice and vague.

S2 (24:25):
Yes, yes. Well, that's the thing. I think humans, when left to their own devices, are extremely stupid and, you know, often their own worst enemies, really. Well, we're getting off topic, though, aren't we?

S3 (24:39):
We are getting off topic, unfortunately. Uh, anyway, that is a wrap for this week. A big thank you to Shane for his, uh, rather interesting view of the world, and for sharing his lovely cafe background with us.

S2 (24:50):
And of course, thanks to you for listening. That includes
our listeners on the Reading Radio Network. You can find
a podcast for this program, plus some extra content on Apple, Spotify,
Google or your favorite podcast platform.

S3 (25:05):
Next week, have you had any experiences on public transport?
I'd say you have. Have you had any interesting experiences
on public transport? Definitely. So we want to hear about
what you've got up to on planes, trains and automobiles.

S2 (25:18):
But between now and then, please do get in touch
with the show. Whether you have experience of any of
the issues covered on this week's episode of Studio One,
or if you think there's something we should be talking about,
you never know. Your story and insight may help someone
who's dealing with something similar.

S3 (25:34):
You can contact us via email: studio1@visionaustralia.org. That's studio, number one, at visionaustralia.org.

S2 (25:40):
Or of course, you can find us on Facebook or
Instagram by searching for VA Radio Network. We want to
hear from you.

S3 (25:48):
Bye for now. Oh, I've just got a notification here. Um,
I can now get Grange Hermitage for $20 a bottle.
I think I'm going to jump on that one. Oh,
what was that? What did you just hear?

S2 (26:07):
Oh, um, Sam, I might not be working with you
for much longer. You see, I've been offered a new job.
All I have to do is work 16, 90 minutes
a day remotely, and. Oh, but wait a minute. They've
sent it in a group text to me in about
two other numbers. Mm.

S3 (26:21):
Sounds a bit dodgy. Anyway, I don't know if.

S2 (26:23):
I'll quit my day job.

S3 (26:24):
You can share my bottles of wine anyway.

S2 (26:25):
Oh, thank you, I appreciate it.

S1 (26:27):
Studio One was produced in the Adelaide studios of Vision Australia Radio. This show was made possible with the help of the Community Broadcasting Foundation. Find out more at cbf.org.au.