Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Now here's a highlight from Coast to Coast AM on iHeartRadio.
Speaker 2 (00:04):
All right, so Justin Harrison, the CEO. And so I look at this and I said, it's one of those acronyms. When I look at it, I want to say U, because I see the V and it looks like the Latin U, right? And so I go, like, YOV. It looks like... but it's YOV.
Speaker 3 (00:26):
Is that it, YOV? You call it YOV? Okay, all right. So, You, Only Virtual?
Speaker 1 (00:32):
You got it. Okay? Good.
Speaker 2 (00:33):
So, so you were just saying, though, that here you were, you were coming out of surgery after having, you know, been in this wreck. Your father comes down, Mister Engineer, Mister Linear Thinker, and he is talking to you, but you're not making any sense because you're still, you know, coming down off the painkillers and whatever.
Speaker 3 (00:53):
So you felt like you were speaking gibberish. So what happened?
Speaker 4 (00:59):
Well, you know, so he's going back and forth with it, and he's telling me the story, obviously, you know, retroactively. And, you know, he says to me, I had this huge sense of relief because you were getting so annoyed with me. And I go, you know, I'm like, why would you be relieved that I'm getting annoyed? And he goes, yeah,
(01:21):
you know, you're exhaling hard, you're rolling your eyes, and you were still there.
Speaker 3 (01:27):
Yeah.
Speaker 4 (01:28):
I knew that your head was okay. I knew that my kid was still there. I knew you were still you. And that, for me, when he said that to me, was a light bulb, which was: I'm not trying to save the information about my mom. I'm trying to save what I know is her, what I know her to be. And it opened up this whole thought
(01:49):
process about the fact that, you know, my mom, as we all are, was many different people to many different people. You know, your personality isn't this sort of universal truth. We're really thousands of personalities, and the personality that is
(02:09):
Justin is really just an aggregation of tons of smaller personalities. And it really, truly made me realize the biggest tragedy of death is that, for the survivors, this personality of mine that came out around my mom for forty years, by the time she died, it was just going to go away. It was just going to be stuck there.
(02:31):
I mean, it doesn't go away. I'm alive. That part of me is still there that wants to connect with her, but there's nobody to pull it out of me, because I only had one mom. And that's when it really clicked for us. So we don't actually build one virtual personality. We call them Versonas. We build dozens, depending on who wants to remember the
(02:53):
person, who wants to connect with that person, because it's going to be a different relationship. And I think you had started this question asking how we do it. When you think about it, everything that a relationship is based on is communication, right? How do you communicate with your friends? How do you communicate with your partner? How do you communicate with your parents, your children, et cetera, et cetera? That's where they see and experience
(03:15):
the difference: how you communicate with them. And so we analyze the communication. And that's evergreen, right? As long as you have the communications, whether it's email or text or phone calls that you recorded or videos, that's all we need. So the person who's passed never had to have recorded anything. As long as we have some samples
(03:36):
of their communication, we can analyze what the patterns were that were unique to the two of you and your relationship dynamics.
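To make the idea concrete, here is a minimal, hypothetical sketch in Python of what analyzing a saved message history for relationship-specific patterns could look like at its simplest. It is not YOV's proprietary pipeline; the message fields, metrics, and names below are illustrative assumptions only.

from collections import Counter
from dataclasses import dataclass
from statistics import mean

@dataclass
class Message:
    sender: str           # hypothetical labels, e.g. "mom" or "son"
    text: str
    reply_seconds: float  # how long this sender took to reply

def extract_profile(messages, sender):
    # Summarize simple stylistic patterns for one side of the conversation.
    sent = [m for m in messages if m.sender == sender]
    words = [w.lower().strip(".,!?") for m in sent for w in m.text.split()]
    return {
        "avg_words_per_message": mean(len(m.text.split()) for m in sent),
        "avg_reply_seconds": mean(m.reply_seconds for m in sent),
        "favorite_words": [w for w, _ in Counter(words).most_common(10)],
        "exclamation_rate": sum(m.text.count("!") for m in sent) / len(sent),
    }

# Example: patterns unique to how "mom" wrote to this particular person.
history = [
    Message("son", "Landed safe, talk tomorrow", 0),
    Message("mom", "So glad!! Call me the second you wake up, love you", 42),
]
print(extract_profile(history, "mom"))

A real system would look at far more than these few features; the point is only that the raw material is the saved communication itself.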
Speaker 3 (03:43):
You know, there's this survey.
Speaker 2 (03:46):
I've seen it done a bunch of times, and the percentage may vary, but the idea is that if somebody got us forty percent, forty percent of who we know we are and forty percent of how we think, if they got us forty percent, then they really did know us. But that
(04:07):
still leaves this huge part of us that is not known to that person. And we might feel a kind of kinship, almost a kind of twinship maybe, even with somebody else, but they don't really get us. They get a part of us. And there's all sorts of psychological surveys about relationships and communication that would back that up.
Speaker 4 (04:29):
Well, you know, I think that's interesting. One of my internal philosophies is that we have to think small to think big.
Speaker 1 (04:36):
I would.
Speaker 4 (04:38):
I would say that no one really knows anybody. And for that matter, most people don't know themselves. I don't know if people are taking mental note of the fact that you change dramatically throughout the entire day depending on who you're interacting with.
Speaker 3 (04:54):
That's really true.
Speaker 4 (04:55):
There's just no way you go home to your wife and speak to her the same way you spoke to your drinking buddies or your colleague or your boss. I mean, they're just very different people.
Speaker 3 (05:07):
Or you do that at your peril, right right.
Speaker 4 (05:12):
If you get home and start talking to your wife the same way you talk to the guys down at the bar, you may not have a wife much longer.
Speaker 2 (05:17):
And we actually have a rule in our house too
that you have to bring the best to the nest.
Speaker 3 (05:24):
That's our rule.
Speaker 2 (05:25):
And so you can't just give the world your best
and then come home and be all grumbly and unhappy.
The point of every day is that you come home
and you recharge your batteries at home, but you still
have to retain that best part of you. You can't
take it out on the people that you love the most.
Speaker 4 (05:45):
I mean, I think that's absolutely beautiful, to start, and the world would probably be a better place if everybody, you know, adopted that. But I think, really, you know, I started noticing it for myself when I started analyzing my own communications. You know, there's all kinds of variance. I mean, really, from the tone of my voice, from the tempo, the speed in which I respond, the language
(06:07):
I use. You know, everything changes. I mean, it's just so drastically different that, you know, you could compare how I am hanging out with my buddies, when we're riding motorcycles and going to bars and, you know, doing whatever we're doing, to how I, you know, would be speaking to my mother.
Speaker 2 (06:26):
I mean, those are night and day people, right? Or a business meeting, or, you know, going to the dentist.
Speaker 4 (06:33):
Right, or when someone is listening to the radio. You know, I'm speaking totally differently than I would be, you know. So it's different, it's a different person. Does that make it any less me? Not at all, you know. It's just a different me, a different version of myself.
Speaker 2 (06:48):
So how, then? What is the process for capturing those data points of your personality that could create a Versona, a viable Versona as you call them, that would help somebody in a grief process?
Speaker 4 (07:08):
So, you know, what we do is we analyze data. I mean, it's looking at data points. So the proprietary nature of what we do is we have thousands of data points. And I think a lot of people get freaked out by the idea of AI, and then machine learning underneath that, and, you know, really all that is,
(07:29):
is, when we go in to code a program to work, all we're doing is saying, hey, if you find something else, add it to the code base. Nothing crazier than that. So basically what we say is, here's a hundred data points to start that are unique to these communications, that vary from sort of
(07:50):
universal natural language models, right? Find what is unique within these two people's communications, and as you find other patterns that don't exist in other places, add them, right? And so what happens is, when you apply a Versona to a natural language model, and you could use any of them, it's drawing from
(08:12):
a set of data that is completely unique between these two people. And so the answers and the responses and the questions and the prompts come out very uniquely to what the surviving person is expecting to hear. And that's, I think, what makes it so special:
(08:34):
it's literally thousands upon thousands of data points, you know, learned by our technology in a matter of minutes, and then when you go to speak with it, it behaves in the way that you expect it to behave.
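Purely as an illustrative sketch, and not YOV's actual implementation: the layering described above, where relationship-specific data points sit on top of an interchangeable natural language model, could be pictured roughly like this in Python. The profile keys, the build_versona_context helper, and the complete_text stand-in for a real model call are all assumptions for illustration.

def complete_text(prompt: str) -> str:
    # Stand-in for whichever natural language model is used; as the speaker
    # notes, "you could use any of them."
    return "(model output would appear here)"

def build_versona_context(profile: dict, relationship: str) -> str:
    # Turn extracted data points into conditioning text for a generic model.
    return (
        f"You are recreating one side of a specific relationship: {relationship}.\n"
        f"Typical message length: {profile['avg_words_per_message']:.0f} words.\n"
        f"Typical reply delay: {profile['avg_reply_seconds']:.0f} seconds.\n"
        f"Characteristic words: {', '.join(profile['favorite_words'])}.\n"
        "Match this person's tone, tempo, and vocabulary, not a generic voice."
    )

def generate_reply(profile: dict, relationship: str, incoming: str) -> str:
    context = build_versona_context(profile, relationship)
    return complete_text(context + "\n\nIncoming message: " + incoming)

# Example use, with a hand-written profile of the kind sketched earlier.
profile = {
    "avg_words_per_message": 11,
    "avg_reply_seconds": 42,
    "favorite_words": ["love", "glad", "honey"],
}
print(generate_reply(profile, "a mother writing to her adult son", "Miss you today."))

The design choice mirrors what the speaker describes: the base model stays generic and swappable, while everything that makes a reply feel like one particular person lives in the data drawn from that one relationship.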
Speaker 3 (08:53):
So I'm not freaked out by AI, and I'm not.
Speaker 2 (08:59):
I respect a lot about what you're doing and what this application can mean. I don't think, however, that I have seen evidence yet, in general, not this application, but in general, on the value of AI and machine learning,
(09:19):
which is resulting in students not even writing a paper, where they just, as we say, type in some data points, hit a button, write a paper, go back to the bar. That's not an education, and it's fairly transparent. I don't think they see it that way,
(09:42):
but, you know, we can make all the comments we want about, you know, they're only cheating themselves and whatever, but are you really getting an education by doing that?
Speaker 3 (09:52):
And at some point that might just show up, believe it or not. And I'm, I'm...
Speaker 2 (09:57):
Less excited about that aspect of AI than I am about this. What I like about this is it actually replicates, using technology, a fairly traditional way of understanding place and family.
Speaker 4 (10:18):
Well, yeah, absolutely, and that's, you know, we call our users community members. I mean, almost everybody in the company is related or has personal relationships, so we're very much about familial bonds and replicating that with technology. But one thing I would say about AI is that ultimately,
(10:39):
artificial intelligence as an umbrella is just allowing computer software to integrate new information into its code so that it can do new and better things. So when we think about the value of AI, I mean, you know, I think about cancer research, I think about what we are doing, you know. I mean, you can have something
(11:02):
that's continuously learning and finding better ways to do it. I mean, that's the real value when it comes to certain things. You know, I think really more what you're thinking about is, you know, natural language processing and things of that nature. Yeah, there's some questions around it. I think, you know, my attorney friends aren't super enthused by this. But when you think about a lot of the things that a program like, you know, ChatGPT,
(11:26):
let's say, and I love what OpenAI is doing, but if you think about the access that creates, you know, not everybody can afford four or five hundred dollars an hour, or whatever the going rate for an attorney is, to get some advice before they have to go represent themselves in a small claims court or before they sign a contract. And so I think there's
(11:49):
a ton of really valuable applications for that kind of technology that go far beyond the sort of media frenzy about students cheating.
Speaker 2 (11:59):
Now, it's not a frenzy, man, it's not a frenzy, man. You just gotta, you gotta trust me on this. As a professor, you just gotta trust me. It's not like a media frenzy, like we all lost our perspective. I teach media, and this is a problem, and it's going to get worse. And even with the idea
(12:23):
of these open sources, say, for example, it's one thing to have like a Zoom where there's a format, but the amount of the kiting of information and the way things are going to be pilfered here and there... I would say your company and
(12:46):
your idea is a great one. I'm not sure you would enjoy it so much if somebody were coming along and they were taking out little bits of it in an untraceable way and then using it to create their own work, when you did all of the original heavy lifting.
Speaker 4 (13:06):
No, for sure not. But I also think that, you know, one, that's already happening. I mean, let's be real, it's already happening to us. You know, we've already seen people popping up and using some of the same strategies. And we could run around, and, you know, we have our technology protected, and we could run around and
(13:27):
start filing lawsuits and whatever. But, you know, there's two things for me that really matter. The first is, if people are applying this and doing something good with it, if it's alleviating loneliness, if it's alleviating grief, and we're not getting a cut of that financially, it's fine. I can live with that.
Speaker 1 (13:45):
I think.
Speaker 4 (13:47):
The other thing is that, you know, we've put in an extraordinary amount of work in building trust with our community members and with users, and being out in the world and explaining why we've done this. You know, it's not easy, if I'm being honest, to tell the story of my mother dying, to tell the story of myself almost dying, over and over and over again.
(14:09):
But I feel a responsibility to be vulnerable and open like that, so that people feel safe using our technology and understand our motivation. And so, you know, I think ultimately people will use our product, and people will feel comfortable with the product, because they trust us as a company and they know why we're doing it. And
(14:31):
to your point, you know, I was a high school teacher, and I was long done teaching before, you know, the ChatGPT revolution. But certainly I've seen many of the techniques of cheating, as we all do as educators. And I hear you, and I'm very firmly in the camp of: you only cheat yourself when you cheat.
(14:52):
And ultimately, you know, I think that, especially with something like media, you know, nobody is paying too much attention to what grade you got in school. So if you cheated your way through it, and you can't put together real work, or you can't put together a piece
(15:13):
of journalism that blows an editor's mind, it doesn't matter anyway, you know. And so I always sort of feel like, you know, karmically and universally, you know, when you do things with good intentions or bad intentions, the results are going to reflect that. And I don't overly worry about that sort of thing, and I think the good far outweighs the bad.
Speaker 2 (15:35):
And I just, I want to hear if you feel that confident about a team of surgeons who faked their way through med school, who are now looking at your broken body. And I'm not trying to exaggerate, and this is not hysterical. This is an ancient problem. This is an ongoing problem
(15:56):
with people who can come up with the same credentials as the next person, and then the only way they're going to be found out, as you point out, you know, it's one thing if you're just saying, well, you know, media is a talent-driven business. So if you've got the talent, you'll rise. If you don't, you know,
(16:16):
then it will reveal itself over time.
Speaker 3 (16:19):
Well, do you want to be that next patient?
Speaker 2 (16:23):
I don't think so. And I think there are enough problems right now with it that I just think we're way out in front of any kind of backstop to it.
Speaker 3 (16:38):
And that's what concerns me.
Speaker 2 (16:40):
And so it's not that I think the whole idea is terrible, and I'm not being some old guy with a hose saying get off my lawn. What I'm saying is, like anything, we need to always be looking twenty blocks down the road, not just three blocks behind us. And
(17:02):
that's the part I'm worried about. Not this, though, because I really like what this is: a tool for helping individuals manage grief. And like I said, it comes from a long tradition, right, of people, even if we just looked at photographs, and that was an innovation in and of itself, to
(17:24):
take a photograph of somebody who had died so that you would have that to look at, so you didn't have
Speaker 3 (17:29):
To rely only on your memory. It's really that kind
of a grief.
Speaker 2 (17:37):
It's sort of a grief manager, sort of a tool for grief managing, that goes back, you know, hundreds of years.
Speaker 1 (17:45):
Listen to more Coast to Coast AM every weeknight at one a.m. Eastern, and go to coasttocoastam dot com for more.