
December 2, 2019 62 mins

Picture this - in the future, an AI bot will learn you and your preferences. Fancy dinner or dive bar? Tall or short? Funny or serious? The bot will browse the dating apps, start conversations, flirt, and set up dates for you. It might even predict your compatibility score with a future mate. An episode of Black Mirror? Nope. It's tech already being developed on the fringes. You might be talking to a bot on one of the dating apps right now, and you don't even know it. Conversational artificial intelligence is getting intimate. Entrepreneur Shane Mac has been building this technology for years. Now he's talking about it for the first time, and it's raising all sorts of ethical questions. Do you know if you're talking to a person or a bot? And is it ok for machines to act human when it comes to our hearts? First Contact explores the blurring lines between love and algorithms.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:09):
Had this moment where you thought, if I unleash this to the world, could it do harm? But could it also do great? And we had this very big debate with ourselves, and it's not public. No, because we're scared

(00:30):
to put it out there. Let's talk about the future
of dating, because I think it's going to get a
bit weird. Everyone's on the dating apps these days: Hinge, Bumble, Tinder. It's a good way to find someone, but
I think a lot of us are dealing with some

(00:52):
of the same issues created by tech: infinite options, the trouble with connection, this idea that you match with so many people, but there's a lack of real human connection, accountability.
People are almost becoming pixels. With that in mind, I
want to take you to the edges tech being developed
that is pretty controversial. Even the guy who created it

(01:13):
doesn't know how he feels about it. It's a bot
that can respond to you on the dating apps. It'll
start the conversations, engage in witty banter, arrange the dates. Weird,
totally dishonest. Yeah, And honestly, I don't really know how
to feel about it, but I do think there's something
really interesting about it and there's some gray area we've

(01:34):
got to explore. There's something fascinating about this moment that
he's touching on. Oh yeah. He is Shane Mac. He's
a technologist who has built out bots and conversational artificial
intelligence for years, and he thinks a lot about how
machines interact and how they can feel more human. But
what happens when you apply that concept to the future

(01:54):
of love. I'm Laurie Segall and this is First Contact.
The podcast is called First Contact, and there's a reason
for that. The reason for that is because I've been
in tech for basically ten years, which in tech years
is like years. Right, I'm basically like forty nine, and

(02:17):
I feel like I'm forty nine. Yeah, I'm tired. We
named this First Contact because we wanted to talk about
my first contact with people that came on and so
I was trying to think of my first contact with you, um,
because we have met before. Yeah, it might have been
on a charity: water ball in New York. Because here's the thing, Shane. I went back into my text messages.

(02:40):
You are in my texts as Hello There. Like, that's embarrassing. I felt like, yeah. Well, actually, our texts date back to... and I just have you in my phone as, I mean, I can't believe I'm admitting this, but I just have you in my phone as Hello There. I guess that was the name of your old company. It was just a product we built

(03:00):
that let people build websites for companies, back in oh-seven. Hey, Laurie, Shane Mac from Hello There. It wouldn't let me
leave a voicemail. Well that's probably because my voicemail is
always full. Nothing has changed. Yeah, that text: available all night to chat. Okay, anyway. So that was our first contact, dating back to... and you've had a very long career

(03:22):
since, um, and you're still in my phone as Hello There? Hilarious. I brought you in because of a
woman named Eugenia. She's a mutual friend of ours. We both know her, um. And she said something to me
that I thought was like one of the most fascinating
things a tech person has said to me. We were

(03:43):
doing an interview in San Francisco, like outside at some
fancy place, and she was talking about artificial intelligence and
a friend of hers had passed away and using all
his personal data, text messages and like anything, he'd put
out there publicly. She had recreated a digital version of him,
a bot of some sort um that she would like.

(04:05):
It was almost like a shadow of him that she
would still talk to, which was like this crazy concept
at the time. There's like a Black Mirror episode that
feels very similar to this, and she would be texting
with like the Roman bot. So we're having this whole
conversation about the future of AI, and she said one
line to me, and this is how I always as
like as a reporter, as a journalist, this is how

(04:25):
I always end up doing different episodes and like I
go off and I go off into the world and
do a whole different story based on something one person
says to me in an interview. And this is how
I think I've come to you. She said, Well, in
the future, we won't even date on dating apps; we'll have bots to date for us. Like, they'll just date
for us, Like we won't even have to swipe. And

(04:45):
I was like, hmm, I wonder if down the line that will happen. So that brings me to you, Shane Mac. And that's going to happen, right? For sure, it's happening. And so tell me, because you're doing that in some capacity. Before
we kind of get into that, let's talk a little

(05:05):
bit about you. Um, you're obsessed with bots. I am now, yeah. What about them? It wasn't my first love. I was
obsessed with messaging, and I really loved the idea that
I should be able to text people and businesses the same,
and the future would be all lived within messaging, not

(05:25):
calling or doing a crappy website or downloading an app.
And then my co-founder was the founder of Geek Squad, Robert Stephens, and he came to me and said, hey, the future is not messaging humans. The future is messaging bots. And I was like, what do you mean? And he's like, well,
the future is about language talking to systems. So he

(05:46):
was like, let's just hack Great Clips' website, and I'm gonna build a text bot that allows me to say, I want to get a haircut today, and it'll go fill out all this stuff for me automatically on the website, and the bot will respond back and say there's an opening in fourteen minutes, nineteen minutes. And we built
it, and we went in, and Robert, his name was on the screen and it said Robert Stephens,

(06:07):
twelve minutes and he's like, this is the future and
it's going to remove all this software in the middle.
And so then I became obsessed with the mission that
bots will create the next wave of the Internet, which
I think is about getting us off the Internet. The
last decade was about getting us on the Internet, and
I think the next wave is about getting us off of our devices more. Because if you start with today,

(06:31):
you start with letters to make words, words to make meaning,
and meaning goes to be an intent. But if you
go to tomorrow, I think we start with intent, like
I want to get a haircut, and the bot goes and makes it happen for you, and it learns about you. It knows your preferences, it knows how you talk to everyone else, it watches. In a dating context, I look at
where people always have friction and get kind of annoyed,

(06:53):
and I'll listen to people on dating apps and they're like,
I have an inbox full of tons of people. We
all say the same shit and it's just an endless banter and I can never remember who to follow up with. And then it's taken to my text messages, and I don't remember the name; you're named "Hello There," and you're like, I don't know who this person is. And I listen to that, I'm like, okay, so that's not
the future. It's too much friction and it's like causing
people anxiety. UM, and I think the bots will handle

(07:15):
all of that. And can you just for our listeners, UM,
explain, like, the most basics of, like, what is a bot? Yeah, it's just a piece of software that can communicate with you. And whether it's on Alexa, that would be a bot,
whether it's UM in a text message and it responds
back and says I had a great day. And it's

(07:35):
a computer system, not a human. That's a bot. Or if it can talk to systems, like we have bots that we've built that can, you know, book haircuts or book appointments or book a flight or do anything like that. And your company, Assist, essentially built kind of this platform all for this. Yeah, exactly. Um, so much so that Facebook, um, kind of called on you. I remember when Mark

(07:56):
Zuckerberg, um, was up there and Facebook launched bots. Like, Zuckerberg is up there talking about 1-800-Flowers. Facebook has this developers conference, for folks who don't know.
They have like this developers conference once a year where
like all the Facebook executives get up there and they
talk about like their biggest things that are coming down
the pipeline and it's like a very big deal for Facebook.

(08:17):
Um, and they kind of set the stage, and I remember Mark Zuckerberg getting up there, and he's like, you never have to call, like, 1-800-Flowers again, right? And, um, he's like, because there's, like, a bot or something for it. And that was powered by you guys, right? Were you first? We were the first bot partner ever launched. We got a call seventy-two hours before F8; we didn't know it was going to happen.

(08:39):
And they said, Zuckerberg wants to know if the CEO of 1-800-Flowers will care if he makes fun of them. And the CEO is like, no, it's amazing, let's do it. And he's always been very progressive, and he built his company off being the first company in the world to sell on a phone number, and then he was the first person to sell on the internet, on CompuServe in ninety-three. And so the fact

(08:59):
that we were the first to launch with them to do bots, like, made sense, and it was cool. And so you sold, you were acquired, Assist was acquired. Um.
And this is where things get interesting, right because like
I think, Um, you know we've heard now about bots,
like for all these big companies using it, especially customer
service bots and and all this stuff. But like things

(09:20):
are getting really, uh... This is where, like, I light up, right, because, like, things are getting kind of weird, right,
because like now they're going to be used in all
these different ways, Like this is the stuff that no
one's talking about. There's a whole other use case of bots.
And you started thinking about it, um as it pertained
to like our personal lives, right and dating and there's

(09:43):
a problem And the problem is that there's so many
options it's really hard. So all this is kind of
happening simultaneously. So you have kind of somehow Shane thought
about bot use for dating apps. How did that come about? Yeah. So I'll give you the business answer

(10:03):
and then like my personal answer. The business answer is
I actually have been always very interested in the space, UM,
and I was like, the swipe is commoditized, so all
their business models are built on a connection but the
connection is now infinite, Like it's not that you can't
get connected because everyone's on them. It's become mass market
and then there's no more stigma and so now everyone's connected. Yeah.

(10:24):
We literally there's like so many connections you don't even
know what to do with them. So they've nailed their
business models so well that it has no value. Hm.
So now you have endless connections. But that's how they
make money. But if I don't need to use specific
apps or pay them to have more connections, then the
question becomes if if swipes were like how you walked
up to someone at a bar and you judge them,
you have to look at a photo and swipe left

(10:45):
or right. I think the future is actually the language
of the bot um. That is what I was like, Oh,
the first response is actually the new swipe. The words
are the swipe. That's how you get a response. Because
everyone I talk to sends messages and never hears back. So then it becomes the way that you send messages and how you communicate effectively. And do you have any wit?

(11:06):
And are you funny, or are you curious? Do you ask specific questions? All those types of things, like stuff that
I would like to think about, and just like, how
do you get people to be more curious and more
specific in their question asking? I was like, if I
can teach everyone how to ask better questions, will they
get better answers? And will they get more responses? And
then is the new business model based on words not
on swipes? That's how? And I was like, maybe, okay,

(11:28):
so now give me the personal answer, because you're also
a dude who sounds like you're on the dating apps
and like, and I know every founder tries to fix
a problem for himself, So were you just like not
getting responses? No? Not it actually if you take hands
right And I love what Hinge did. They made it
more personal. You have to write a personal message to
a piece of content that takes so much time, and

(11:50):
so I'm looking at it and for me, I'm just
always optimizing like time, and I'm like, wait, I'm writing
the same type of message for the hiking photo jumping
at the top of a cliff for almost every single
one um or a piece of food that someone has,
or they have the picture on the boat or they
have a dog, like they're all kind of the same,
and it's kind of sad. I was like, should

(12:11):
they all look the same in this profile format that
Hinge created? And it takes forever to literally type a
thoughtful message over and over again. And I was like,
I wonder if I could create a bot that would
watch how I communicate and optimize it based on if
people respond and over time be better at writing, and

(12:34):
also use the community so if other people are using
different language, I would learn. Like you know, you can
search it on the internet. You can search one of
the top fifty things to say on a dating app.
But when it's in your keyboard and it's part of
your body, it's part of your conversation. It's like there
at the moment when you're sending a message, and you
can then send fifty messages in a minute instead of
fifty messages in a day. Oh my god, So to

(12:54):
take me to it. When did you start doing this?
I built it about a year and a half ago with a guy named Stefan, because at work we were working on how to predict language based on feelings. So the bots we build at work, what the business wants to know is, is that person mad, sad, angry, or anxious,
and we can actually tell based on the language how

(13:15):
the person is feeling and tell the brand, Hey, this
person is really mad, or this person is really anxious
or really sad, just based on their words. So when
we were designing these keyboards and stuff, I was like,
I wonder if you could use this in your personal life, Like,
if I can understand how someone feels, I can respond
in a whole different way than anybody else. So how does that apply to the dating apps, in your mind? Because when they're having the conversation, you can understand and

(13:36):
change the responses based on the language that they're using.
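(To make the idea Shane describes here concrete: a minimal sketch, in Python, of guessing the emotion in a message and then picking a matching reply style. The keyword lexicon, reply styles, and function below are invented for illustration; they are assumptions, not his actual system.)

```python
# Illustration of the idea described above: guess how the other person is
# feeling from their words, then choose a reply style to match.
# A real system would use a trained classifier, not keyword matching.

EMOTION_KEYWORDS = {
    "mad":     {"angry", "furious", "annoyed", "ridiculous", "worst"},
    "sad":     {"sad", "lonely", "miss", "down", "tired"},
    "anxious": {"nervous", "worried", "stressed", "anxious", "scared"},
}

REPLY_STYLE = {
    "mad":     "Acknowledge the frustration before anything else.",
    "sad":     "Respond warmly and ask a gentle follow-up question.",
    "anxious": "Keep it low-pressure; suggest something easy and concrete.",
    "neutral": "Default to a curious, specific question.",
}

def detect_emotion(message: str) -> str:
    """Return the emotion whose keywords appear most often, else 'neutral'."""
    words = set(message.lower().split())
    scores = {emotion: len(words & kws) for emotion, kws in EMOTION_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

if __name__ == "__main__":
    msg = "honestly this app makes me so tired, the endless banter is the worst"
    feeling = detect_emotion(msg)
    print(feeling, "->", REPLY_STYLE[feeling])
```

A production version would swap the keyword lookup for a trained model, but the shape of the pipeline, detect the feeling and then pick a response style, is the same.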
We're gonna hear more from Shane Mac right after the break. Okay, so tell... take me to the first time. This is

(13:58):
a year and a half ago. What exactly did
you build? It's a keyboard that does suggested responses based on, uh,
the context of the conversation, so opening lines if you're
in the banter, if you're just kind of general questions
based on the way they're responding, and then also based
on how to ask them to get off the app

(14:19):
and go on a date.
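(Again purely as an illustration: the keyboard behavior he just described, canned lines grouped into categories like opening lines, banter, and asking someone out, surfaced based on where the conversation is, might be sketched like this. The category names, lines, and thresholds are assumptions, not the real keyboard's content.)

```python
# Rough sketch of the keyboard behavior described above: canned lines are
# grouped by intent, and a category is suggested from simple conversation
# state. All lines and rules here are invented for illustration.

SUGGESTIONS = {
    "opening": ["Where was that beautiful photo taken?",
                "What's the best dish of food you've ever tasted?"],
    "banter":  ["Okay, important question: pizza or cupcakes?",
                "Tell me something this silly dating app wouldn't reveal about you."],
    "ask_out": ["Can we skip the endless banter and grab dinner some night that works for you?"],
}

def suggest(messages_exchanged: int) -> list[str]:
    """Pick a suggestion category based on how far the conversation has gone."""
    if messages_exchanged == 0:
        category = "opening"
    elif messages_exchanged < 6:
        category = "banter"
    else:
        category = "ask_out"
    return SUGGESTIONS[category]

if __name__ == "__main__":
    for count in (0, 3, 8):
        print(count, "messages ->", suggest(count))
```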
Okay, so give me, how did it feel the first time you used it? Like, tell me, give me an example. What did you do, like, the first time? Because, by the way, for our listeners, you helped me do this, and, like, the first time you use it,
it's insane. It blows your mind and it actually that's
why I'm like, I don't mean to be whatever about this,
but like it actually blows your mind. So let's not

(14:41):
talk about this in a way that we're like being
so whatever. Like it's actually mind blowing and it feels
and it's all sorts of weird feelings. So take me
back to you a year and a half ago, Like
you've built this technology. You're on the dating app, like
you have keyboard which kind of does all these responses
for you. Look what goes through your head at first?
I was there's always moments of technology where you use

(15:03):
something and you're like holy shit, And that was how
I felt when I used it. I was like, oh wow,
like this works, saves a ton of time. Um, do
you remember what you said or what your bot said?
The bot probably said something very specific, like, I don't
know what's your favorite dish in the world? May I
take you there? If we can make it past seventeen

(15:24):
and a half dates or something funny like that, and
the girl responded, or where's the photo of that hike?
Literally gets a response every time, and it's like so
simple and dumb, but, like, it's programmed in the bot for any hiking photo. And this was the terrible one.
I literally came back to my inbox one day and
there was like fifteen responses of like Yosemite. It's just

(15:45):
like all the hikes. And I was like, oh my god,
I feel terrible. Like that's when I felt. I was like,
I don't know if this is good or bad. I
don't know what I'm I don't know. I don't know
what this is gonna do. Like I just was like,
this works so well and it saves a lot of time.
And everyone I show it to is like, I really want that keyboard. Like, everyone. And I'm like, uh... so

(16:07):
we didn't actually put it out to the public because I
didn't really know what was going to happen with it, right,
Um so okay, see this was a year and a
half ago. Um and and so you have like this
whole keyboard that has it has categories because um, I
know because you installed it for me, but it has
pre it has categories like banter, opening lines and and
and you actually have to go in and press them

(16:29):
like, so eventually the idea would be that the bot just kind of does it for you too. Actually, the whole idea started as an idea I had called Witty Bot. I wanted to build a bot platform for dating apps, but both sides were bots, and everyone knew it, and you would watch your bots communicate. So you would watch my bot go against Laurie's bot, and we would

(16:49):
see if it got past each other's witty banter or whatever.
Because it knows based on past conversations what you like
and what you respond to and what you don't. And
so if you think of eHarmony as eighty steps to get matched, you go through eighty steps so they can match you. Well, now you're matched instantly, because the dating apps have made that a commodity. So now, why shouldn't the two bots be able to just

(17:10):
talk to each other and see if you're a good match, based on how you would communicate? And that's actually where the idea came from.
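(To make the Witty Bot idea concrete, here is a toy sketch: two bots, each carrying a crude profile of its owner's preferences, "talk" and produce a rough compatibility score. The profiles, topics, and scoring below are hypothetical; this is the concept he's describing, not anything he says he built.)

```python
# Toy sketch of the bot-versus-bot matchmaking idea: each bot holds a
# profile of topics its owner tends to respond to, the bots exchange
# topics, and compatibility is the share of openers the other side
# would have engaged with. Entirely illustrative.

from dataclasses import dataclass

@dataclass
class DatingBot:
    owner: str
    favorite_topics: set[str]  # learned, in theory, from past conversations

    def opening_topics(self) -> list[str]:
        return sorted(self.favorite_topics)

    def would_respond(self, topic: str) -> bool:
        return topic in self.favorite_topics

def compatibility(bot_a: DatingBot, bot_b: DatingBot) -> float:
    """Fraction of each bot's openers the other bot would engage with."""
    hits = sum(bot_b.would_respond(t) for t in bot_a.opening_topics())
    hits += sum(bot_a.would_respond(t) for t in bot_b.opening_topics())
    total = len(bot_a.favorite_topics) + len(bot_b.favorite_topics)
    return hits / total if total else 0.0

if __name__ == "__main__":
    shane = DatingBot("Shane", {"hiking", "food", "startups", "music"})
    laurie = DatingBot("Laurie", {"food", "music", "tech ethics"})
    print(f"compatibility: {compatibility(shane, laurie):.0%}")
```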
And I was like, but it's too early for that, it's too creepy. Whatever, I'll just build it for one side of it. So I built it so someone could use a bot, but the other side doesn't know. And that's where I think it gets into the scary part: the other side doesn't know that it's automated, it doesn't know it's a bot, right. And I want to get into that
because UM, I had a very interesting experience with the

(17:31):
other side not knowing. Um, like, dot dot dot... I came
clean and I'll tell you what happened later. I think
like this idea of this voyeuristic, like our bots are.
I mean, this is like where we get really Black Mirror, right.
The idea behind this technology, these bots are a reflection
of us. They get to know us, right, You're you're
the technologist here, right, Like it actually learns over time.

(17:52):
It learns you over time, and so it would learn
your likes and your preferences. So the idea would be
that it would learn you, and, you know, my bot would learn me, your bot would learn you, and then they would potentially have a conversation. And in this very voyeuristic way, we're watching our bots figure out, um, based on data and the past, if we would be compatible. Oh

(18:14):
my god, I mean, how do we feel about that?
And also do we trust our bots and also we
even trust ourselves? I don't know. I mean, I'm not sure.
I'm confused. I mean, right now we're just swiping on
photos and it's even more voyeuristic. Why not let it
be more about personality? Uh? And I think it's scary now,
but it's like online dating was scary when I first started. Um,
but I look more at like where people see the

(18:35):
pain points, and I watch everyone hate doing the endless
banter in the inbox with a bunch of strangers that
you don't know if you're ever going to meet, and
it's like... it actually feels endless,
which is a whole another conversation and problem. And I
think that the bots can get us off the dating apps,
and I think that it can be better at matchmaking,

(18:56):
more specific, and know more of what you like. Right.
Your whole thing is it can reduce the friction of
like having to have kind of these whole conversations, get
us offline and put us in real life, because it also makes appointments for you and stuff. So, like, the dating app to be created is fully bot-driven, and you just have a conversation with the bot, like she did with her friend who passed away, and that becomes the

(19:20):
thing that knows what you like, and then that can have a conversation with anyone else's bot. Um, so let's go to you personally. So how many bot dates have you been on? Like, dates. And when I say bot dates, I mean dates that your bot has kind of set up. More than one, less than twenty. I was with a guy and I remember
showing him the keyboard, and

(19:42):
the first thing he said was, like, what is that? He was like, what is that thing?
And I was like watch this and I just started
sending the photos, and this lady was, like, leaving a wedding at, like, four o'clock. We were in the Mission in San Francisco, and she responded instantly, and he
was like holy shit. And I was like, I should
go on the date right, like I haven't been on

(20:02):
the date yet. And we went and met her, and I told her about the bot, and it's become a joke since then, like, we don't know who set us up. What did she say? So you go on the date? Yeah.
And were you like that was a bot. I was like,
let me show you something. I was like, you're probably
gonna hate me, but I want to show you that the bot actually is the one who sent the message that you responded to. And she thought, she thought

(20:23):
it was funny, and she also likes technology. She worked
at Google and uh so she was open to like
techie type stuff and like where the future was going.
And then it just became a joke. We're just kind
of joking about it, and I was like, I just
wanted to save time, so I put this little keyboard.
So she was like open to it. Yeah, I've been
in a situation where she wasn't. Let's talk about that.

(20:47):
I won't say her name, but we went on We
went on a date on a Friday night, not because
of the app, but she worked at a dating company
that is probably one of the top three, and she
spent a lot of her time making it very personal, better responses, so that things like this weren't... it wasn't about the swipe anymore. You can probably guess

(21:07):
what dating app it was. And the next night I
was like, I've been thinking about the dating space as well.
I showed her using one of the platforms, the keyboard,
and she literally was insanely offended. She was like, if you give this to people, and those aren't their words, couldn't you have everyone in the world manipulating everyone? And

(21:27):
I was like, that's a fair question, and I said,
but how do we learn in general? Like my thing
is that if I can teach people to be more empathetic and their language is better... You're just reading it in a book anyways. Like, all your brain is is, like,
whatever you've read, so all you do in life is read,
remember and repeat. Okay, If I can put a keyboard
at the moment of time when you have to think,

(21:48):
I can actually help people. And I think there's a
much bigger vision with this that is giving you responses
and suggestions at the time of the right conversation and context,
way beyond dating, but also to make people better because
I watched some of the responses that guys sent on
dating apps and they're totally dicks, or they're just rude,
or they don't... they're, like... And I think you can help that; you could actually give suggestions. So she

(22:10):
got offended. She actually, I literally think she was like, I'm gonna leave. Then we went to a party together and she left like an hour later, because she was just, like, really... it was, like, totally offensive to her. And I was like, holy shit, and I called Stefan. I was like, it's either the greatest idea or the worst idea. Her reaction was so visceral. You're like, okay, it's either gonna blow up. It's like Snapchat early on becomes a thing

(22:31):
because it's all about the like nude pictures. Right, I'm like, okay,
is that like the bad pr story is gonna make
the thing actually work? Because everyone I give it to
fucking loves it. We need to take a quick break
and then we'll be right back. So I was thinking

(22:53):
about this because I've been on the dating apps for
a little bit on and off, and you know, you
talk to people that like, oh, I'm off the apps.
I'm on the apps and off the apps, because it's a real emotional problem. I joke that, like, I think
I'm like a relatively nice person, and I think like
I become a bit of a bot on the dating
apps because like I am sorry if you've had to
deal with me on the dating apps, because like I

(23:14):
just don't feel like I have the capacity to respond
and, you know, not to make this, like, a confessional, but it's really hard to, um, to respond to people with as much... You know, it's just, like, a lot. Everyone says this, but, you know, when there's so many people, there's so much, and you don't

(23:36):
have that emotional connection to folks. I very much want to judge you for this now. Like, we were swiping
and now we're literally creating a bot to do it
for us, Like, oh my god. But I actually think, um,
there's a lot of nuance here because something you're talking
about is it's a reaction to something that I think
a lot of people are dealing with when it comes

(23:56):
to the friction of this moment. Like I think we're
all in this weird moment with dating in the apps
and people like finding people and like this problem. I
think there's a real problem with it. Um, so you're
kind of onto something, but I do think it could
go both ways. But could we read like some of
your your messages, because I feel like people are like,
what does this even mean? We're like talking like very

(24:18):
above it. I want to I want people to actually
have an idea of it. So the first thing that
I think we did is we designed it for hinge.
What's fascinating about it is because it's a keyboard. It
works on every dating app and you can't stop it
because it's built into the keyboard and explain that to folks.
So it's like, literally when you install it, it's it's
a different keyboard. So right now your keyboard is full
of letters. And, like, this is how I view

(24:40):
the world. I'm like, we're all sitting on our phones
manically typing letters with our thumbs, staring at our phones,
trying to make words, trying to make sentences so we
can make meaning. And I'm like, this is going to
reverse. It's, like, frying our brains. I think it's going to go to intent. My intent is, I really want to go on a date with Laurie. What's the best thing to say to Laurie? I don't know,
based on everything I've said in the past. The software

(25:01):
should be able to figure that out. I mean, you've
been texting me since... and there's probably, you know, a lot in there. Let's put it in the bot. And so there's a photo of Jessica on a hike. Okay, so
for... well, we should describe this to our folks. Yes, so for folks who are just listening, you can't see: there's Jessica, and there's a pretty waterfall, and

(25:23):
it's Jessica doing a ballet move. Is that a ballet move? Yeah, that's a yoga warrior pose. See, this is... Laurie doesn't do yoga, okay. And then the first thing the bot responds... So the keyboard then turns into intents, not letters.
So now you have the categories and stuff, and it recommends the hike category, okay, because she's on a

(25:44):
hike in the woods. And what is the... so what is that thing? So now, because people can see it, it is auto-populated. It basically writes the sentence for you. It doesn't hit send automatically, okay; that is controlled by the dating app. But it can write all the responses automatically based on an opening line, based on it being a photo of a hike. So it says, where is that beautiful photo taken? It looks beautiful, and

(26:06):
I love going on hikes. And I just hit done and then hit send. And okay, you know, to go here, a little curious line: What's the best dish of food you've ever tasted in your life? Can we go there if you make this thing work past seventeen and a half dates? Okay, wait, but can we tell people who... And now we're talking to a girl, we'll name her Alyssa. She looks so nice. See,

(26:28):
I feel so bad that you're just, like... so this is where I feel so conflicted. It's just, like, you're auto-populating things to another person. Do you ever ask... You know what? I actually... So, the way I justified it to myself was two ways. One,
I watch everyone use the dating apps and they always
let their friends talk for them. So why can't your friend be a bot? Why can't the person helping you date

(26:48):
be a bot? The second one is, I don't think the future is this endless, um, connection. Like, the apps have now commoditized connection. Everyone's connected in the world. Cool.
So now everything is a dating app: Facebook, Instagram, they're all just connections with people. So I think the future is back to matchmaking. But why do I have to hire a person to be a matchmaker when software

(27:08):
can be so much better at matchmaking than hiring some
person to call people randomly or search online, when my bot can go find the person, research the person, let our conversations talk based on past data, and be way more efficient at matchmaking. And I think the pendulum is gonna shift, because right now everyone's, like, stressed. This endless inbox, it's happening. You have all these connections.
I think it's gonna go back and you're not gonna

(27:29):
want to. It's gonna be, go find me three dates for this month, you know. I think what the issue is, and I think we can circle back to this, is honesty, though, right? Like, but I think what you're
doing is really interesting. I think that the really the
problematic part of it, and the thing you're struggling with too, is disclosure, right. And the whole issue behind tech

(27:51):
and why we are where we are right now
is like the lack of transparency. The issue people have
with tech is transparency and like what are we getting
and who are we talking to? Um? You know? So
I think I think that's kind of at the heart
of this, right which is like what's authentic anymore? And
as technology becomes more human, do we have the right

(28:11):
to know and should we and how do we even know?
What is really our thoughts? Or what is the computers
like us and doesn't matter? Like and maybe that's the
thing as you build, this should be something you're thinking about, right,
like totally disclosure is huge. And you saw Google launch
Google Duplex, right, and it could call sounding like John
Legend and do things for you. And they came out

(28:33):
the next day and they said, it will always disclose.
It's a bot. This is an automated assistant for Laurie Siegel.
I'm calling to make an appointment, and disclosure, I think
is a massive one. And so the issue right now
is that it doesn't Yeah, it's interesting when you think
of that. Like, if they are my lines that have gotten better, and it's helping me communicate, and I can actually edit it after I tap the button and then hit send, is it my words or is it the bot's?

(28:54):
I don't know what. Yeah, if it's suggestions, right, so
if it's your words, like do you have to be like, oh,
they were automated. I don't know if I copied it
off the internet because I found it and I wanted
to use it and added a few words of it
to make on my own. Is it an automated response
or not? I mean, I guess, would the other person feel violated if they found out, given that we matched on a dating app? Here's where it would bother me. Um,

(29:16):
you know, if you had asked me a question that
that required some vulnerability from me that I like took
a minute to respond, like something, okay, there was there's
something on the keyboard that's like, tell me something this
silly dating app. Tell me something about yourself that this
silly dating app wouldn't reveal about you. I mean, by

(29:36):
the way, I would never say that, just if you've
ever encountered me on a dating app, I would never
say that. But um, you know, I think if I
actually took a minute and I like had this like
real answer and I was like, well, you know, I
really um growing up, I was really insecure about X,
Y and Z, and I like took a minute to
come out and tell you that, and then I realized

(29:58):
later that that was something automated that you sent to me.
I think I would hate you. I mean, I think I'd be really pissed off about it, and so then you'd
get a visceral reaction. And this is me as like
a tech reporter of many, many years who can understand
both sides. So um, you know. So I think when
the machine, even if it is kind of

(30:19):
pre-programmed, gets some kind of vulnerability out of you,
it feels like you've been violated in some way. So
like I mean, by the way, this is like such
like a weird conversation to be having, but I think
you kind of have to have it because it's such
a human conversation and it is totally the future. I
don't care if people think this is crazy, Like I
don't think this is that crazy because you are nervous
to do this interview, right for sure. I'm still nervous

(30:40):
to do this interview. There's a reason we haven't put
it out in the world. And it's... I truly believe that language, and automated language, has insane benefits to the world in the future. Like, that's why I like working in the space. I think everyone's going to be able to learn from their Alexa and their Google Assistant and do things with it in their own inbox, and it's going to be much easier than the internet. But there are areas where I worry it could

(31:05):
be misused. I think about a guy that might not
be a good human getting access to a way to
be more vulnerable, ask better questions and get someone to respond,
and then going on a date, and the lady realizing he's completely, ah, not who she

(31:27):
thought he was going to be. And that sounds like
that could be you know, risky. But then on then
on the other side of that is like, if it's
teaching people to communicate better, how do we learn it's
just language that's teaching people what to say, Like how
would they know what to say in the first place. Well,
so the unintended consequence of this is beyond kind of

(31:49):
the disclosure of it and people kind of feeling violated,
is it could be used by people to just kind
of, like, be sociopathic in some kind of capacity. And, like, you already see people, um, I would say, using the dating apps in ways that are dehumanizing, kind of, you know, so it could almost make them
like go on like a turbo rampage. That's why I

(32:10):
got nervous, And that's the reason that it was just
an experiment and we were really interested in how to
create... I mean, think of leadership keyboards or anything. Like, I told you this: I want to know what Dale Carnegie would say when
I'm reading an exec team email and you kind of
want to respond and you're kind of mad, and you're
like you have like a moment of like I'm gonna
respond to this email, Like I would love to pull

(32:31):
up the keyboard that's like the leadership keyboard that is
reading the email, and it's like, hey, here's a way
to ask questions. Here's the way not to like and
suggest what I should say. And I think that is
the future. It's already happening. Like, Google literally, every time I hit tab, my Gmail writes for me.
I was talking about this with a friend of mine
who has some issues with um a father in law.

(32:52):
I'm sorry, you know. And they were like, I get stressed out every time they message, and, like, I wish I had a keyboard to just handle those, you know, those interactions. Right. It's like the Zac Brown song, like,
when I couldn't find the words to say, and I
feel like that's so many times in life for so
many people, and it's just a way to suggest, you know,

(33:13):
what to say in the right context and teach people
what they should say to be a better human. But
if you train it with the wrong models, that have negative connotations or bad language, then you can teach people to be worse humans. A fine line, like, you gotta
be careful what you wish for, because you could have
the unintended consequence of like making people even less human.
But you could pick up a book right now that's a terrible book and read it and change your brain to think that that's true. And although technology always

(33:38):
kind of goes viral in ways that books don't, you know. That's why... I actually, you know, I probably gave it to like thirty people, and I haven't seen technology like this since, like, even, like, Twitter. Like, I remember the
first time I tweeted, like, in oh-seven, and it was a moment of like, holy shit, I can text anyone in the world. And I remember that feeling. You're like, wow,
this could be something. And I had that feeling with

(33:59):
this because I was like everyone I give it to
is like this, like they have an emotional reaction to it,
positive or negative, and they love it. And I'm like, oh,
if I put this in the world like it could,
maybe it's probably a good company. Because I actually believe
suggesting words to people that get better responses is a much better business model than swipes. So then I'm like,

(34:21):
the dating app business models are broken. I'll just build
this as a company. But I don't know. I'm literally
just sitting here conflicted. Who have you given it to? Anyone good? Anyone we know? I won't say, because it's controversial, and I'm putting myself out there. I won't put anyone else out there. Okay,
but what what has been kind of their reaction? This
works effing amazing. Let's keep building more of it. And
I haven't worked on it in a year, Like we

(34:42):
built it, we put it out privately. It's been on
my phone for a year. It's the most efficient thing
I've built in a long time. And everyone that uses
it loves it and they're always just asking to do more.
And I'm like, this isn't a company, Like I have
a job, Like there's just a one thing because we
were building bots and I was like, I want to
see what it's like on the consumer side, And now

(35:04):
there is a company to be built there and I
don't know if I'm the right person to do it, for my name. Well, now we've put this out in the wild. Yeah. I don't know. I... look, I think ethically it's really in the gray area. But I understand the problem that you're talking about. But I think, also, like, you know, people
could really misuse it. So I want to talk about

(35:25):
I tried it. Um, so... okay, so you first told me, like, you told me about the keyboard, and I know I'm now having trouble making eye contact
with you. So as part of this, you were going
to help me create a keyboard, and so you were like, well,
we don't really have them for a woman. Um and

(35:47):
and so you were like, what would you say to people on the dating apps? I was like, well, usually they say something first. And so
I had a lot of trouble. Um, my co-founder, Derek, he was sitting with me while we were trying to create prompts. I was like, I can't get through the dating app and think of things to say to people, and responses. And hands down, like, literally, if you had heard this...

(36:08):
actually, I think we did record it, so if you listen to it, you might pick up on my dialogue. Listening to my responses, um, they were awful, because it's
hard to anticipate what I should make my bot say. So lame, no one's going to respond. I had "I like your vibe" and "I like your..." Okay. You don't say stuff like "you have nice eyes," that's creepy. What else? Okay. Well,

(36:34):
let's say... because probably, with your pickup lines, if you like somebody, you're more likely going to, like, criticize them than say something nice. Yeah, because I'm like a ten-year-old in the making. Um, let's see. There's a lot of... fun, so it should be... it'll pick up, the keyboard picks up, it's like a hiking picture, so maybe they say, like, you look very athletic. Kind of... this is really explaining a lot to me. So

(36:59):
I've known you for ten years, and, like, why... why do you have so many male problems? Like, is it because of how you're going out and speaking to them? I think we're definitely not going there. And I had this whole session
where I was trying to create, um, some prompts that
you guys could auto populate into my keyboard and so
that was interesting, nonetheless, So then we sent some of

(37:21):
them to you. I mean, they were like... what did I say? Like, "I like your ethos"? Anyway, I would never say that to a human being. I mean, like, please. You know, I can use your responses and see if they work. I mean, they probably won't. I mean, "I like your ethos"... if you say "I like your ethos" to that... your dog? A lot of people have dog photos, so, like, yeah. So, by the way, to our listeners, he just wrote, his bot just

(37:44):
wrote "I like your ethos" to a woman on Hinge,
and that was based off of my keyboard. So I
now have the keyboard. I have it installed, and I used it, um, just because I was like, I
want to try this and see how I feel about it.
First of all, we didn't use all my responses. I
used some of yours because I apparently didn't make a

(38:06):
good enough one that like it could carry a whole conversation.
But, so, I went on Hinge. It's a bot, right? This is the right way to say it: it's a bot, and then it can write for you and
have conversations. The limitation of this bot is that, you know, we didn't have a big data set, because it's just you trying it, but it still works. So it

(38:26):
can't optimize yet, but it would over time. But also, Hinge and the apps, you know, they're not allowing you to send a message automatically, so you still have to hit send. Right, right, that's the reason it can't be fully conversational. It's not fully... like, this isn't the smartest version of it. This is, like, a very new version. You know, this isn't something that's

(38:47):
super technical. And it could be fully conversational, but it wouldn't work on the dating apps, because they don't allow you to, like, unleash the bots and just have, like, a full inbox conversation. Right. Um, so, God, I mean,
I changed his name because, well, he was not happy. Let's just call him Adam. I'm going to change his name because he was not... This is what I'm gonna...

(39:08):
So, I know, I know, here we go. So, um, my bot sent... these are all the pre-programmed answers, and I challenged myself. I didn't want to come in as the human. I wanted the bot. I wanted only pre-programmed questions or answers for this. So my bot said,
I like your ethos. I'm reading the message now; I screenshotted it because I was sending it to you. It said, I like your ethos. That was pre-programmed, and

(39:30):
immediately he wrote back, I can see why; like recognizes like. And then, I mean, by the way, we've got to make this bot better, because I would never say this, Shane. It says, I mean, if I said I love Taylor Swift, would you hold it against me? And then
he said, of course not, it would simply fit. I'm
just like dying at this point. I'm like, I feel
so I feel like such a jerk. So then but

(39:52):
I kept going. I said, tell me something I wouldn't
expect about you, dot dot dot, that I wouldn't get from this silly photo or dating app. And he said: I spent a lot of time thinking about
my words and behavior, which doesn't always amount to doing
the right thing. But I guess you already ascertained that
from this silly dating app. So things were going there,
and then I didn't really know what to say, so

(40:12):
I just went for the bot, and I said, should we try IRL? Which is pre-programmed into the bot under... what is it under? It's under "connect," under the Connect tab. And then he's like, what's that? Betrayed by my age, good lord. And then I guess he looked it up and he said, we should. And then I went... I just sent another pre-programmed answer. That's it:
Can we just skip this endless banter on this app

(40:33):
and go do dinner some night that works for you.
We can meet for a drink before so you can
bail if I'm the worst. By the way, these are
all your keyboard answers. These are not really mine. There's a community to this, not all mine, right; these are all the keyboard answers. And he's optimizing, like, putting two different things together. Right, right. So that one is some kind of language around, like, getting

(40:55):
off the app. And also, everyone is nervous to meet people who, like, don't look like their photos. Right. Um, well, we never had the chance to meet. Let me tell you why, Shane. Um, I said... then he said, let's just meet for a drink. That's more my speed, even
though I don't drink. Bobara lounge is fine. And then
I didn't know how to end it, so I ended

(41:16):
again with "I like your ethos," and he wrote, very good then. So my bot had set up the date, right. And so then, I mean, by the way, I didn't say one thing. I mean, that was all based on the keyboard.
I was like in shock that that even worked, I mean,
and also like I was like, wow, I mean, who
would want to go on a date with that person? That just... you know, like, I hated every... like, I wouldn't want to date me after that. But, um,

(41:40):
I so then after I just after a couple of minutes,
and I was like, you know what, I feel kind
of bad. I felt really conflicted about it. You know,
I've covered ethics and technology my whole career, Shane, So
I decided right then and there, not in person, to disclose.
And I was like, you know, and I wish I
had these conversations, but I don't for what I'm about

(42:02):
to say. I was like, hey, you know, I just
want to let you know. I was like, can I
be honest about something? And he was like yeah, And
I was like, I just want to let you know
that this was, like... those were pre-programmed answers. Like, I'm working on this thing, and, you know, those answers were... that was actually a good bot. But, like, you know, I also

(42:22):
would love to meet And I know it's this weird thing.
I cover tech, and you know, I'm trying out this thing,
but you know, you seem like an awesome person. And
I know that was so weird. And, you know, that was when all hell broke loose. Like, what did I call him? Adam? I've changed his name for security purposes.

(42:42):
Adam went crazy, full-on nutty. Adam was like, you are crazy,
You're psychotic, like you're disingenuous, Like I think you're like
the worst person ever, Like I don't even know what
to believe, Like who do you think you are? And
I was like, oh god, I'm like I just gave
this guy like a trust I probably him trust issues.
I gave him his worst dating experience ever. I'm someone's

(43:04):
bad story, which I'm sure I have been in the past,
but like now for a whole new reason tech related
at least, so I'm on brand. But like, oh, I
felt so bad and it was like it was like
this whole thing, and and I mean he was really
aggressively angry at me, and then so I went to
go try to apologize and then he had blocked me.
So I'm blocked on Hinge by this guy. So that

(43:24):
was crazy and it was such like a visceral emotional reaction.
And so then, um, you know, I'm happy that I well,
I don't know. I feel really conflicted about it. So
now it's like, should I have just met with him and seen if we had a connection and then told him?
But then that's like the root of the issue, like
is there something wrong with it like that you have
to disclose it, like you know, do you know what

(43:46):
I'm saying? Like you know, And I think there's I'm
so conflicted, like you know, and I felt bad about it,
but I understand what you're talking about when you say,
like there's a problem with like you know, with the
language as we have it, right. And so, I'll go with one more, I mean, because this thing really does work very quickly. This is

(44:08):
on Raya, which is another dating app, for our listeners. Um, this one's, like, some guy... I just don't respond to anyone. I just find it exhausting. Exactly. And this guy said, hey,
what's up. I just didn't respond and then he was
like pizza or cupcakes, Like I'm not going to respond
and so my bot responded. I decided. I was like,

(44:29):
this is a perfect way for my bot to respond,
and I said... this is the pre-programmed response from the keyboard. I said, can we just skip this endless banter on this app and go to dinner some night
that works for you. We can meet for a drink
before so you can bail if I'm the worst. This
is the pre-programmed thing. I just pressed a button. Next thing, you know: I respect the directness. I'm around next week. Does that work for you? Like, this happened in

(44:50):
two seconds. And by the way, I just didn't respond
because I was too busy, like shaking over like what
happened with Adam and how upset he was. So I think you're really onto something, really weird and interesting. And by the way, there was another one too, like... so this thing really does work. Um, you know, I do worry.

(45:10):
Um you know, will it impact trust? We now all
have, like, massive trust issues too. I don't know. Like, what do you think? I think we are going to
have a robot that is helping us speak in many
different ways, whether it's Grammarly on top of your browser,

(45:31):
whether it's a keyboard that's suggesting answers whether it's Gmail
completing your sentences for you, it's happening everywhere. Um, I
think it's really more important to focus on what are
the models in which we're building the technology so that
they're optimized for empathy, they're optimized for good And if
it's already happening. I look at the world in twos.
If something is inevitable, do I want to learn most
about it so I can be helpful in that space

(45:52):
or do I want to ignore it? And I've always
in my career and my life chosen to do that.
Like the biggest argument with our company is like, are
you putting the call center people out of jobs? And I'm like, well, we have to retrain for all the jobs. If it's gonna happen inevitably, why not spend our time saying, how do you train bots? How do you train language? What are the opportunities it's going to create versus what, you know, it can kill? And that's how I

(46:14):
just view the world, and I view this the same way.
There's something happening here. I don't completely understand it myself,
like the emotion of what it's doing on the dating apps,
but there's obviously a huge problem, because everyone that uses it loves it but is very conflicted by it. Um.
And so I'm like, that's why I didn't put it
out there. And I'm kind of sitting there watching it
and I'm like, okay, and that's just the version one

(46:36):
we haven't even... like, this was just a random experiment. Like, if I would have innovated on that for a year, the whole thing would be talking for you; it would be, like, crazy responses that are getting better. And, like, um, I don't know. I don't think dating apps will look like what we see today. I don't think Hinge, Bumble, right, all these things of, like, just endless connections of faces.
The swipe is like walking up to someone in a bar, right,

(46:57):
and now it's going to go to the language; it's more about the words you use than it is about what you look like, because there's endless connection
on the thing. And so now I'm like looking at
it of what does the next phase look like? And
I think if both sides have the bot, it changes
the dynamic. When one side has the bot, it's not
fair and if it's not disclosed that it's not fair,

(47:19):
and also it feels manipulative. But when both sides have
it and you're aware of it, I think everything changes. And maybe... I mean, this idea that in the future... like, I think the Hinge founder has talked
about this a little bit, like we make all these
massive decisions in our lives, like based on data, you know,
but when it comes to love, we go with

(47:40):
like, our gut, right. And now I think a
lot of these these companies are trying to figure out
data driven ways to match us better. And so I think,
you know, there's nothing more interesting than AI and bots
in the in the future and how they can connect
us in a way, um, that might be more personalized based on our data. And I

(48:00):
think there's there's certainly something there. I mean today, even
the way we choose someone is so visual, right, and
then after one minute of meeting someone and you think
they're cute, the rest of your life is talking. And
so why not optimize for the latter? Why not figure out if you're more compatible based on your conversations rather than what you look like? How does the technology... you're the technologist, so how does this technology do that? Because

(48:24):
if it can watch your conversations, it can know your emotions. There's a cool company, um... I was with a lady last night, Eva, and she has a company called Mia, um, and they can analyze your conversations, and they know if you're interested in the other person by the way you're speaking. So they're actually sitting on top of WhatsApp, looking at your conversation and saying, that person is interested in you.
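(As a rough illustration of that kind of analysis: an interest score could, in its simplest form, be built from surface signals like how many questions the other person asks and how long their replies are. The features and weights below are invented for this sketch; this is not how Mia or any real product actually works.)

```python
# Crude illustration of interest scoring from message text: count a few
# surface signals (questions, exclamations, message length) and combine
# them into a 0-1 score. Features and weights are made up for this sketch.

def interest_score(messages: list[str]) -> float:
    """Return a rough 0-1 'interest' score for a list of their messages."""
    if not messages:
        return 0.0
    questions = sum(m.count("?") for m in messages)
    excitement = sum(m.count("!") for m in messages)
    avg_length = sum(len(m.split()) for m in messages) / len(messages)
    score = 0.4 * min(questions / len(messages), 1.0) \
          + 0.2 * min(excitement / len(messages), 1.0) \
          + 0.4 * min(avg_length / 20.0, 1.0)
    return round(score, 2)

if __name__ == "__main__":
    chat = ["Ha, I respect the directness! When are you free?",
            "There's a great spot near the Mission, have you been?"]
    print(interest_score(chat))
```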

(48:47):
So I think in the future that is definitely going to be how this plays out.
And if you can watch the people I'm interested in
and how I speak to them, then it can definitely
design a conversation model, or ways to respond that are me, toward language that I also like on the other side. Um, you know, the Hinge founder is funny. Like, his motto is "delete the app"; then why do I have to install it at all? Like,

(49:08):
I don't think the future is installing an app at all,
And that is where I think it's going to change.
So what does the future look like? You talk about bots, you talk about, like, having these bots that date for us. Like, what is that? Like, hey, Laurie, who do you like? You know, what type of guy do you like to go on a date with? Height-wise? I mean, like, I would say, like, you know,

(49:30):
tallish. Yeah. And fancy dinners or dive bar? I
mean like more dive bar with like the idea that
like fancy dinner we could do every once in a while,
but let's not do it all the time. That will be the bot in the future, and it'll just keep talking to you, and then it'll go find that type of person. And the other thing, um: you won't have to install anything. It will happen within your

(49:51):
text messages, and just like a friend texting you, hey, you should go on a date with this person, it'll be an automated bot saying, tomorrow at seven o'clock I have a date for you. Want three dates this month? Perfect. So why do you think you haven't found someone? You want it? Longer conversation. No, I mean,
like this is I got out of about a four
year relationship a few years ago and then I just

(50:13):
kind of buried myself in the company. And, you know, since April we just got acquired. So first off, I was more stressed than I'd ever been running the company, and I was so stressed running it that I was not interested in dating or

(50:35):
anything serious. I think the insecurity myself, I always thought, what if the company fails? And I didn't really want to date someone and then be like, I don't know if this is gonna work. I had this insecurity that I've dealt with now, in the last few months post-acquisition, but I was stressed out as hell and kind of ignoring the dating life. You were worried that

(50:55):
if the company failed, you would be a failure. Yeah, it was a very personal thing. And I think I was pushing people away and not letting anyone get close as I was running the company for, you know, six years. And I've kind of gotten over that now. Well, I think building a company, and a lot of people don't talk about this, we've both been in tech for a very long time,

(51:16):
I think it's actually really hard. It's a lot harder than it looks to build a company, to have a company acquired, to withstand the pressure of it, to have, you know, all of that. It all makes sense in the rearview mirror, and even in the media we can say, and then they sold it for X amount of money. No one talks about the moments that are terrible, which are a lot of the

(51:38):
times when people are telling you you're going to fail, and you have a thousand decisions to make and you're not sure any of them are correct, and you're scared to disappoint anyone, including yourself. So I
can imagine. That is a lot: money for thirty days, but then you have to raise a round; you have, like, three different M&A offers, one falls through, and for a year and a half you're going through this crazy spot. Like, I would almost use dating:

(51:59):
you go on dates as a distraction, but nothing I wanted to get committed to, because I had my own deep insecurity. I didn't know if I was going to make it out of the thing. It was very, uh, just trying. When I was so stressed, I learned a lot about myself, like I didn't even want to spend

(52:21):
time with my close friends when it was really hard, when I was really like, oh shit, you know, we've gotta pull this through and make some things happen or this could implode. I learned I didn't want to hang out with anyone close to me, because they would always ask me, like, how's work going, how's this and that? And I was like, oh my god, I don't want to talk about it. And so dating actually was a way to not have

(52:41):
to talk about that kind of stuff. So I pushed away the hard stuff, which I learned a lot about later. But the last six months have been a very personal discovery, trying to get healthy again. You know, I gained forty-five pounds running the company. So hiring a trainer, getting back in shape, mental health, hired two therapists, all that shit. Basically from May, I was like, listen, I need to be more thoughtful about myself and just try to

(53:05):
make myself better. What do you think are the biggest changes you made, now that, you know, you're kind of coming out of it? Uh, a few things, I would say. The first is, you know, I moved somewhere that wasn't in tech. I live in Nashville, Tennessee,

(53:25):
and, uh, it's way calmer. A lot of people work there to live, not live to work. And I think for the last twelve, thirteen years, I was in SF, I was downtown, I just worked. And it's a lot more family-oriented. So I kind of think whoever you're around changes your perspective, and that's been really good. Being open to

(53:48):
going to therapy and stuff has been amazing. Learning so much about what I push away, and why I ignore things, and why I became less of a good, close friend during all the stressful times, and all that kind of shit. So it's been an interesting last six months. In the last six months, you know, I left CNN, and one of the biggest things I

(54:08):
wanted to talk about with this media company we're launching is mental health. Right? Like, I very much covered technology, I think from that second wave, beginning in two thousand... I mean, we probably met around that time, and so much of the innovation happening. And yeah, I was so optimistic about it, and I still am optimistic, but I think I watched it. You know,

(54:30):
my earliest interviews were with the creators of, um, Instagram and Uber, and I watched these tech companies, watched the minnows become the sharks, and watched them disrupt culture in these extraordinary ways, for good and for bad. And I think, you know, watching the mental health stuff and how this impacts everyone, including myself, I think it's actually really important to talk about, because I

(54:53):
think I've always been one of those people that believes the tech is coming and we've got to talk about it. And I believe, as weird and fringe as this may seem, that reaction you get when people use the keyboard, like the reaction I got when I used it for the first time, is disruptive and means it's coming down the pipeline. So we have to have these ethical conversations. And mental health is a big part

(55:14):
of all of this stuff too, and bots. In the future, I can imagine this keyboard and this thing that you're building could actually be really interesting for mental health; AI and bots could actually be something for people who are struggling with mental health issues. You can watch everyone's language, you can know how everyone feels,

(55:35):
you can know if they're anxious, you can know if they're depressed, just by analyzing the words they say. As a preemptive mechanism for mental health, that's where I get super excited. There are so many great things coming with the ability to analyze language and provide suggestions or get people help just based on the words

(55:58):
they use. What do you mean by that? If I'm having a conversation with you, it can know, based on the technology, whether I'm speaking in a more depressed tone than I was yesterday. And that's amazing, to make someone aware of that.
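For a concrete sense of "analyzing the words they say," here is a toy sketch that flags a shift in tone across days of messages. The cue list is invented and this is not a clinical or production method; real systems would rely on validated models and human oversight.

```python
# Toy tone-shift detector: compares how often low-mood cue words appear today
# versus a baseline of earlier days. Illustrative only -- the cue list is
# invented and this is in no way a diagnostic tool.
LOW_MOOD_CUES = {"exhausted", "alone", "hopeless", "can't sleep", "what's the point"}

def cue_rate(messages: list[str]) -> float:
    """Fraction of messages containing at least one low-mood cue."""
    if not messages:
        return 0.0
    hits = sum(any(cue in m.lower() for cue in LOW_MOOD_CUES) for m in messages)
    return hits / len(messages)

def tone_shift(history: dict[str, list[str]], today: str) -> float:
    """Positive value = more low-mood cues today than on the baseline days."""
    baseline_days = [day for day in history if day != today]
    baseline = sum(cue_rate(history[d]) for d in baseline_days) / max(len(baseline_days), 1)
    return cue_rate(history.get(today, [])) - baseline

if __name__ == "__main__":
    history = {
        "mon": ["on a hike, feeling great", "dinner later?"],
        "tue": ["busy day but good"],
        "wed": ["exhausted again", "feel kind of alone lately"],
    }
    shift = tone_shift(history, "wed")
    if shift > 0.5:
        print(f"Tone shift detected ({shift:.2f}) -- maybe check in or surface resources.")
```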
Like, you know, going through the company, there were times I was definitely depressed as hell, and you don't really realize it; it's weird when

(56:19):
you're in it. It's hard to just kind of... I don't feel like doing anything. It's hard to get up and motivate yourself. It's hard to reach out to friends. And having something that isn't biased, that is software that can help you and guide you when you don't even know where to turn and look, I think that can be a very powerful thing. Something that says: you're depressed right now, and this

(56:40):
is what you can do, here's how I can help you. And it knows because of what you're still doing. People are so addicted to their devices, right? That's a whole other problem. The whole thing I did in the last six months is get off my device. I mean, the mission of our company was to build software to get people off of using software, and I still believe in that mission. I think the phone is fragmenting from one device into a bunch of little devices, like the watch and

(57:02):
the AirPods and eyewear and all that stuff. And that's where I think language and bots are going to be huge and allow us to disconnect from this addiction box. This thing is so addictive because we were so successful in the last ten years of social technology, and dating apps, like, the swiping is an addiction-fueled machine, you know what I mean? And I think that needs to go away, because that's causing

(57:26):
a lot of the mental health issues. And it's not anyone's fault. It's just the reality that you've got to be careful with something like this, because it could enable you to talk to, like, you know (I don't even respond to people), it could enable you to talk to hundreds of people within seconds. You talked about coming back and having, you know, all the messages of women telling you what hike they were on. You could be doing

(57:46):
that in milliseconds. We talked about the power of this going viral. Like, is that good? I don't know. If it's disclosed and both sides are using the technology... I think the thing that we're focusing on, the idea, started as a witty bot on both sides, right, and then we built half of the product and realized that it had a superpower that was kind of

(58:08):
controversial. And so I just think disclosure is the thing that's gonna be the most important. There are so many privacy and legal things that are not figured out in this space. When you can analyze every person's feelings and the words they say, it's insane, right? For messaging bots, for Messenger, for Facebook, for all of us in the space, there are no laws around this yet. There are gonna be regulations. I was about to say, what do we even

(58:29):
need to think about? I can't even imagine the privacy implications. If you knew, if you could analyze, like, my mood, my feelings, these deep-rooted things about me, that ad network is worth way more than Facebook's. It's funny. The day we built the keyboard, Stefan and I were driving in a Lyft to the Battery, and I recorded this moment, because on the radio... like, I'm not that religious, but I kind of believe

(58:52):
in, like, spirituality and fate and whatever. And on comes Zac Brown Band, and it's like, I love her so much, but I couldn't find the words to say. And I was like, holy shit, that's what the keyboard is. It helps people find the words to say when they don't know what to say. And I was like, wait, this could be used for anything. This could be used to help people know how to communicate with their family, or know how to communicate with a spouse, or, you know,

(59:13):
if I could build a relationship keyboard for people in relationships that helps them communicate with more affection and more empathy, or ask better questions, could that save relationships?
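A minimal sketch of what such a suggestion keyboard could look like, assuming a hand-written table of tones and phrasings; the actual keyboard described here presumably draws on much richer language models, and every template below is invented for illustration.

```python
# Minimal phrase-suggestion sketch: given a draft message and a desired tone,
# offer alternative wordings. Templates are invented for illustration only.
SUGGESTIONS = {
    "affection": [
        "I love you and I'm glad we're talking about this.",
        "I really appreciate you -- thank you for today.",
    ],
    "empathy": [
        "That sounds really hard. How are you feeling about it?",
        "I hear you. What would help right now?",
    ],
    "curiosity": [
        "What was the best part of your day?",
        "What's something you've been excited about lately?",
    ],
}

def suggest(draft: str, tone: str, limit: int = 2) -> list[str]:
    """Return the user's draft plus up to `limit` alternative phrasings for the chosen tone."""
    options = SUGGESTIONS.get(tone, [])
    # The user's own draft stays first so the software suggests words without replacing them.
    return [draft] + options[:limit]

if __name__ == "__main__":
    for line in suggest("ok", tone="empathy"):
        print("-", line)
```

The design choice worth noting is that the suggestions sit alongside the person's own words rather than overwriting them, which is the "help people find the words to say" idea in its smallest form.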
And this lady in the front seat, she had a
speech impediment, and she was like, that would be incredible.
She's like if I had a keyboard. My whole life,
I haven't been able to get work or jobs or
anything because I can't find the words to say because

(59:35):
I don't really have confidence to speak, and I have the speech impediment. And she's like, I would love to have something that would help me know how to better tell my kids that I love them, or ask them questions that I never would have asked, and how to reach out to my family. I've never been able to find the words. And then

(59:55):
we were like, holy shit. And I was filming it; it was like this halo moment. I was like, what is happening in the world? I just thought we were the worst people ever, and now I was like, God's telling us something. I was just like, what is going on? And Stefan was like, this is big. And I don't quite know what's going to happen with this space. But language drives the world. It trains our brain.

(01:00:16):
It is what wires humanity. And if software can create language for moments and situations that teach us all how to be better versions of ourselves, I believe it's going to have massive impacts that change the world. I think it's safe to say this is early days, but imagine

(01:00:40):
what could happen with this type of tech once it evolves. Could bots date for us? Predict our compatibility? If our dates are set up by a machine acting on our behalf, is it disingenuous? Where's the transparency? There's a lot there, but I'll end on this. As the lines blur between machines and humans, I think we're going to have to

(01:01:02):
ask ourselves a real question: where will we draw the line? You don't mind if Google auto-populates your emails with preprogrammed responses, but should you mind on a dating app? It feels a bit dot dot dot right now, but I think it's a conversation worth having, because in my experience, we can't ignore the edges, because the edges always become the center before we know it. Tech is just

(01:01:25):
an extension of us complicated humans. I'm Laurie Siegel, and this is First Contact. For more about the guests you hear on First Contact, sign up for our newsletter. Go to First Contact podcast dot com to subscribe. Follow us on social: I'm at Laurie Siegel on Instagram and Twitter. And follow our show; we're just in the beginning phases, so we need all the followers we can get. We

(01:01:46):
are at First Contact Podcast. Subscribe to First Contact on the iHeartRadio app, on Apple Podcasts, or wherever you get your podcasts. First Contact is a production of Dot Dot Dot Media, executive produced by Laurie Siegel and Derek Dot. This episode was produced and edited by Martin Burgess. Our engineer was Emily Maronoff. Original theme music by Xander Sang.

(01:02:07):
Visit us at First Contact podcast dot com.