
October 13, 2024 9 mins
Today: E043-2024 Cyberium Podcast - Positive identification and biometric condition by Antonio ROSSI
 

ENGLISH PODCASTS: https://technocratico.it/cyberium-podcast/

In each episode, we dive into articles published on technocratico.it by Raffaele Di Marzio or explore 
his reflections brought to life through AI analysis and techniques, powered by Gemini Pro, which present in-depth discussions in English, explaining the topics in a simple and concise manner. Our mission is to reveal, in a straightforward yet precise way, how technology influences every aspect of our personal and professional lives. Whether you're a tech industry professional seeking expert insights or a curious listener wanting to understand how digital security impacts your daily life, Cyberium is your gateway to comprehending the holistic influence of technology, offering a unique perspective thanks to the integration of cutting-edge AI analysis. 

Today, we'll be exploring a cutting-edge topic: positive identification and the biometric condition.


To guide us through this analysis, we'll be relying on the insights of Antonio Rossi.

Tune in to gain valuable perspectives and stay ahead in the rapidly evolving tech landscape. 

All reproduction rights are reserved by Cyberium Media Miami Productions and Technocratico.it

Content Creator Direction : Raffaele DI MARZIO https://www.linkedin.com/in/raffaeledimarzio/

Content Creator : Antonio Rossi https://www.linkedin.com/in/antoniorossi/

For inquiries, you can reach us at podcast@cyberium.media

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Welcome to Cyberium. Here, technology and cybersecurity are made simple
for everyone. Whether you're a tech geek or just curious
about the digital world, we've got you covered. Each episode
we dive into the latest topics from technocratico dot it
and break them down so you can stay informed and protected.

Speaker 2 (00:23):
This is a.

Speaker 1 (00:23):
Cyberium Media Miami production. Let's get into it.

Speaker 2 (00:28):
(Music)

Speaker 3 (00:48):
In this episode, we analyze another topic proposed by Antonio Rossi.
Antonio is a senior expert and frequently speaks at various
conferences focusing on information and communication technology and blockchain environments.
His experience today guides us in exploring positive identification and
biometric condition, offering a unique perspective. Let's begin this stimulating discussion.

Speaker 4 (01:12):
Hey everyone, and welcome. We're taking a deep dive
today into something I know a lot of you have
been asking about: biometrics. And get this, it's way more
than just fingerprints and face scans.

Speaker 3 (01:26):
It really is.

Speaker 2 (01:26):
We're talking about the information our bodies give off and what
that means for all of us, especially when it comes
to things like privacy.

Speaker 4 (01:32):
Okay, so I feel like I'm already using biometrics every day,
even if I don't realize it. I mean, I unlock
my phone with my face, I use my fingerprint for
banking apps. But this deep dive, I've got to say,
it's got me thinking what else is out there? The
source mentioned things like gait analysis and retinal scans. I'm
not even sure what those are.

Speaker 2 (01:50):
It's true, a lot of people don't realize how integrated
biometrics already are in our lives. So gait analysis, that's
actually looking at the way you walk, your own specific
way of moving. And then retinal scans, those use the
patterns in your eye to verify who you are. It
might sound like something out of a movie, but it's
already being.

Speaker 4 (02:07):
Used. Wild. So how does any of that even work?
You mentioned enrollment and matching before. What's that all about?

Speaker 2 (02:13):
Ah, you're picking up on the key stuff. Enrollment and matching,
that's really the core of how biometrics works. It's a
two-step process. So first, your unique biometric information,
let's say your fingerprint, it's scanned and stored. That's the enrollment.
Then when you go use that fingerprint, like to unlock
your phone, the system compares that new scan to the

(02:33):
one they stored. If they match, boom, you're in.

Speaker 4 (02:36):
So my phone learns what makes my fingerprint unique and
then checks any future scans against that.
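[Editor's note: the two-step flow described here can be sketched in a few lines of code. This is a hypothetical toy example, not how any real phone implements it; real systems store a mathematical template rather than a hash, and matching is fuzzy, scoring a fresh scan against a threshold instead of testing exact equality.]

```python
import hashlib

templates = {}  # enrolled user -> stored template (toy: a hash digest)

def enroll(user: str, biometric_sample: bytes) -> None:
    """Step 1 (enrollment): scan the biometric and store a derived template."""
    templates[user] = hashlib.sha256(biometric_sample).hexdigest()

def match(user: str, new_sample: bytes) -> bool:
    """Step 2 (matching): compare a fresh scan against the stored template."""
    candidate = hashlib.sha256(new_sample).hexdigest()
    return templates.get(user) == candidate

enroll("alice", b"fingerprint-scan-1")
print(match("alice", b"fingerprint-scan-1"))  # True: the scans match
print(match("alice", b"someone-else"))        # False: no match
```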

Speaker 2 (02:41):
Got it. But back to gait analysis and retinal scans
for a second. Those seem way more, I don't know, intense.
Can someone really tell who I am just by how
I walk?

Speaker 4 (02:51):
They can? And that's where both the promise and the
concern with biometrics come into play. But before we get
into the possible downsides, let's look at some of the
ways this tech is actually doing some good.

Speaker 2 (03:02):
All right, So the source mentioned some really interesting cases,
especially in healthcare, like Imprivata. They're using biometrics to make
sure the right patient is getting the right treatment, no
mix-ups or errors, which, honestly, in a hospital setting that
seems like a huge win. Huge. It has the potential
to completely change how we approach healthcare by making sure
patients are correctly identified every single time. It cuts down

(03:26):
on potentially dangerous mistakes, and it can really streamline how
people get the care they need.

Speaker 4 (03:31):
It really makes you think about all the ways biometrics
could be used to improve things. But the source also
talked about using it to stop fraud. So it's not
all about hospitals.

Speaker 2 (03:40):
Right, absolutely, think of it like situations where the typical
security stuff just isn't enough. Passwords, people forget them, they
get stolen. With biometrics, you are the key. Literally, that's
an extra layer of protection, whether it's for your bank
account or any kind of sensitive data.

Speaker 4 (03:55):
Okay, yeah, that makes sense. But then the source also
brought up some worries with the whole thing, especially about privacy.
They even use the term like PII vulnerability, but I'll
be honest, I'm not one hundred percent sure of what
that even means.

Speaker 2 (04:08):
That's a great point, and it's important to understand, right
PII that's short for personally identifiable information, So it's any
data that could be used to say, hey, this is
you specifically, And the vulnerability part comes in because well,
you can change your password, but you can't really change
your fingerprints, can you.

Speaker 4 (04:25):
Okay, yeah, that's a little freaky when you put it
like that. So someone gets my biometric data, it's not
like I can just go get a new eye scan, huh.

Speaker 2 (04:33):
And that's exactly why, if this stuff is misused, it's
a big deal. And the source made a point that really
stood out. Even if they make biometric data anonymous, you know,
take your name off it, it could still be traced
back to you.

Speaker 4 (04:45):
Hold on, really? I thought if the data was anonymized it
was, like, gone, you know, untraceable. How is that even possible?

Speaker 2 (04:50):
It's tricky. Picture this. There's a bunch of health
info out there, right, and it doesn't have your name
on it, but maybe it's got your heart rate variability
or something. Now imagine someone else has your data from
your Fitbit.

Speaker 4 (05:03):
Okay, I see where you're going with this. My Fitbit's
always tracking that kind of stuff. Someone could put two
and two together, right? Even.

Speaker 2 (05:09):
If the info seems tolly random, it can be pieced
back together. And when it's your body we're talking about,
that's scary stuff.
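[Editor's note: this "piecing back together" is known as a linkage attack. The toy example below, with entirely made-up data, shows the idea: an anonymized health dataset with no names can be re-identified by joining it against a second dataset that shares a quasi-identifier, here a heart-rate-variability reading from a fitness tracker.]

```python
# "Anonymized" health data: no names attached.
anonymized_health = [
    {"hrv_ms": 42.7, "condition": "arrhythmia"},
    {"hrv_ms": 61.3, "condition": "healthy"},
]

# A separate, named dataset leaked from a fitness tracker.
fitness_tracker = [
    {"name": "Alice", "hrv_ms": 42.7},
    {"name": "Bob", "hrv_ms": 61.3},
]

# Link records by matching the shared measurement (the quasi-identifier).
reidentified = [
    (t["name"], h["condition"])
    for h in anonymized_health
    for t in fitness_tracker
    if abs(h["hrv_ms"] - t["hrv_ms"]) < 0.1
]
print(reidentified)  # [('Alice', 'arrhythmia'), ('Bob', 'healthy')]
```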

Speaker 4 (05:16):
That's where it gets even wilder, because the source goes
into this whole other level of hidden data that biometrics
can show, like about our health and everything. I had
no idea your fingerprints are linked to, like, chromosomal disorders,
Down syndrome.

Speaker 2 (05:28):
Even. It's true, and it's pretty new territory. It's not
just that they ID you, it's that our bodies, they
kind of give off clues about our health, even our genes.
Like, some fingerprint patterns might mean someone's more likely to
have certain conditions.

Speaker 4 (05:42):
So on one hand, that's amazing early diagnosis, maybe even
new treatments, But then what if that's used against people,
like you're denied something based on what your body might
develop later on one.

Speaker 2 (05:55):
Hundred percent. And that's what we've got to be so
careful about as this tech gets more advanced. The source
gave some examples that are honestly kind of unsettling, like what.

Speaker 4 (06:03):
If employers start using this in hiring. Imagine being told, sorry,
your data analysis suggests the chance of X disease down
the line, no job for you, and maybe you're perfectly healthy.

Speaker 2 (06:12):
Right now exactly. It's like they're predicting the future and
then judging you based on it. And it's not just jobs.
Think about insurance companies. What if they jack up your
rates because your retinal scan suggests a higher risk of something?
Privacy, discrimination, all of it's on the line.

Speaker 4 (06:29):
It's like we're walking talking data sources and most of
us haven't even begun to wrap our heads around it.
I'll be honest, it can feel like a lot, you know,
like we're kind of standing on the edge of something
massive and it's exciting but also a little creepy all
at once.

Speaker 2 (06:43):
Oh. Absolutely, it's big stuff we're talking about. But here's
the thing to remember. The more we understand about this
tech where it's going, the better we can all deal
with it. Right, Knowledge is power and all that.

Speaker 4 (06:53):
Okay, that makes me feel a bit better, So it's
not all doom and gloom. Then the source mentioned solutions. Yeah,
what can people actually do about any of this?

Speaker 2 (07:02):
Totally, it's not hopeless, not at all. There are things
we can do, both each of us individually and then
as a whole society to make sure this goes in
a good direction.

Speaker 4 (07:11):
Okay, I'm listening. Give me some hope here.

Speaker 2 (07:14):
First off, the most basic thing, be informed before you
share your data with any app, service, whatever. Do a
little digging, find out what biometrics they're taking, why they
even need it, and then how they're going to use it.

Speaker 4 (07:28):
So like reading the fine print before you sign on
the dotted line, making sure you know what you're getting
into exactly.

Speaker 2 (07:34):
Don't just click agree without thinking about it. A few minutes'
checking could save you a lot of trouble later on.
Makes sense to me.

Speaker 4 (07:41):
What else? We've got to push for better laws about
all this, stronger rules to protect our privacy. Support the
folks out there who are trying to make companies responsible
for how they use our biometric data. So it's about
making our voices heard as consumers, as voters, the whole
nine yards.

Speaker 2 (07:55):
Yep, and this is cool. The source also talked about
supporting companies that are making tech to protect privacy, so
like ways to use biometrics but without storing all our
raw data everywhere.

Speaker 4 (08:08):
So instead of my actual fingerprint being out there in
some database, it's more like they use it to make
a key, but that key only exists on my phone
or something.

Speaker 2 (08:16):
You got it. It's called on device processing, and it's
way more secure. By supporting those companies, we're telling everyone, hey,
we can have security and privacy. We're not giving up
on that.
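[Editor's note: here is a minimal sketch of the on-device idea just described, with a hypothetical `OnDeviceVault` class. The assumption is that the raw biometric never leaves the phone: only a salted, one-way digest is kept locally, and a successful local match would then unlock a separate credential for talking to any server. Real implementations use secure hardware and fuzzy matching, not exact digest comparison.]

```python
import hashlib
import os

class OnDeviceVault:
    """Toy on-device store: keeps only a salted digest, never the raw sample."""

    def __init__(self, biometric_sample: bytes):
        self.salt = os.urandom(16)                    # random salt, stays on device
        self.digest = self._derive(biometric_sample)  # one-way digest, stays on device

    def _derive(self, sample: bytes) -> bytes:
        # One-way key derivation: the raw sample cannot be recovered from this.
        return hashlib.pbkdf2_hmac("sha256", sample, self.salt, 100_000)

    def unlock(self, sample: bytes) -> bool:
        # Local match: re-derive and compare, without exposing the stored sample.
        return self._derive(sample) == self.digest

vault = OnDeviceVault(b"fingerprint-template")
print(vault.unlock(b"fingerprint-template"))  # True: same biometric unlocks
print(vault.unlock(b"wrong-finger"))          # False: rejected locally
```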

Speaker 4 (08:25):
It's good to know there are folks working on the
good guy side of tech, that's for sure. This whole deep.

Speaker 2 (08:30):
Dive has been eye opening.

Speaker 4 (08:31):
I gotta say, we learned a ton, but it's clear
this is only the beginning, right?

Speaker 2 (08:35):
Totally agree. We're just starting to figure out what biometrics
really means in the long run. But staying informed, fighting
for our privacy, supporting the right companies, that's how we
get a say in how it all plays out.

Speaker 4 (08:47):
You've definitely given me a lot to think about, that's
for sure. Like, if our bodies are becoming data themselves,
who really gets to decide how that data is used?
That's a question I think we all got to be
asking ourselves. And on that note, we'll leave you all to
ponder that one. Keep digging deeper, folks. This is just
the start of the conversation.

Speaker 2 (09:10):
Who are you going to call? All reproduction rights are
reserved by Cyberium Media Miami Productions and Technocratico dot it.
For inquiries, you can reach us at podcast at Cyberium
dot media.