
September 16, 2022 52 mins

What was HitchBot all about and what happened to it? From hitchhiking across Canada to a violent end in Philadelphia, we tell the robot's story. Scott Benjamin guest hosts.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio.
Hey there, and welcome to TechStuff. I'm your host,
Jonathan Strickland. I'm an executive producer with iHeartRadio,
and how the tech are you? It is a Friday,
which means it's time for a TechStuff classic episode.
This episode is called The Sad Tale of Hitchbot,

(00:28):
and in many ways it's another example of comparing the
culture of Canada versus that of the United States. Scott
Benjamin was on hand to serve as a guest host
in this episode, and it originally published on August twenty-fourth,
two thousand fifteen. Let's listen in. Let me paint you

(00:48):
a word picture, Scott. In July two thousand fourteen, a hitchhiker
began a historic journey from Halifax, Nova Scotia, to get
to Victoria, British Columbia, on the other side of Canada.
We're talking crossing the entire width of Canada. And if

(01:11):
you were to do that on the most efficient route possible,
if you got to choose the route, that would be
at minimum around three thousand six hundred forty-four miles,
or five thousand eight hundred sixty-four kilometers. However, hitchhikers
rarely have the ability to call exactly what route needs

(01:32):
to be taken. They are at the mercy of the
drivers that pick them up. I'm just I'm headed west, exactly,
take me as far as you're going. And not that
I condone hitchhiking or anything like that. That's kind of dangerous.
It can be, and we will definitely get into some
dangerous territory as part of this conversation. Well, the full trip
took closer to ten thousand kilometers or about sixty two

(01:54):
hundred miles. Uh. And here's the weird part. The hitchhiker
was a robot. That is weird. Yeah, not a person.
I've never picked up a hitchhiker. In fact, for a
long time, I had never even seen one. Uh, I saw
a lot in Hawaii. Surfer culture is still going strong. Yeah,
there's certain parts of the United States where you can
expect to see more hitchhikers than in other parts, right,

(02:17):
and uh, in Hawaii, I guess would be one of
those places. I've been there too, and I know exactly
what we're talking about. I think I've seen more hitchhikers
in Hawaii standing at bus stops attempting to hitch a ride,
just hoping to catch a ride before that bus shows up.
You know, it's like one or the other. Eventually that
bus will come. But in the meantime, if I
can just get a free ride down the road, that's
what I'll do. I've just seen lots and lots of

(02:39):
surfers trying to trying to get to the beach, trying
to catch that next wave. That's right, man, you can't...
the waves wait for no one. And so we're talking
about Hitchbot, which a lot of you have probably heard about.
Hitchbot first made the news in two
thousand fourteen during this historic attempt to get a
robot to hitchhike across all of Canada. Successful? It was successful,

(03:02):
so spoiler alert there, and then went on to do
this again in Germany. It went all over Germany, and
it also took a little vacation in the Netherlands. And
then finally there was an attempt for this robot to
hitchhike its way across the United States, which was cut
short, just like the robot. This all sounds so nice.
What could possibly go wrong? So you might be wondering,

(03:25):
I've heard about this, what actually is going on? The
first thing I want to say is, while we're talking
about a hitchhiking robot, honestly, if I were to describe this,
I would not have used the word robot. Yeah, it's...
they're using the term very loosely here. I
think it's because the form factor makes it look sort
of like a robot. Uh. And it definitely had the

(03:48):
benefit of having, like, a light display that made
a very simple smiley face, so you had kind of
a, you know, a head that you could identify. But really
we're talking about a hitchhiking computer. It's just a
hitchhiking computer that was in a more or less static
robot body. It's simple, but they found a way to
anthropomorphize this thing to the point where people look at

(04:10):
it and say, oh, that's kind of cute. Yeah, because
it has a torso, it's got arms and legs. The arms
and legs don't move. The torso is static. The torso's
a bucket. Um, literally a bucket? Yeah, literally a bucket. It
does have solar panels that are arranged on the outside
of the bucket, so that's one of the ways that
the robot gets electricity. The other is that it will

(04:33):
ask people to plug it into the lighter socket in
a car. Yeah, it's only about three feet tall,
so it's very small. It's waterproof, which is kind of surprising.
It's waterproof. It is. Well, and also, I mean, it
wears waterproof boots. Every single article, every article without fail,
talks about the fact that it wears Wellies. The brand

(04:55):
name boot. Yeah, okay, like a rubber boot.
It's named after the Duke of Wellington. Well, its
arms are, what, pool noodles, so, you know, it's not
like they went to great lengths to
try to make this thing look human or anything like that.
But it does have arms and legs. It does have
a face, as you mentioned. It has, um... It almost
looks like a Tupperware container on the top. It looks like

(05:17):
a hat, a beret or something like that. That's designed
to actually protect the electronics, the tablet computer
that is running the software that the robot uses.
Likely part of the waterproofing overall, I guess. And, um,
it's GPS-equipped. The thing weighs about twenty-five pounds total,
so it's not that heavy. Really? It has its own built-

(05:38):
in seat. It's like a car seat almost. Well,
it is a car seat, from a kid's car seat,
that you can then put in your car and buckle in,
so it's secure when it's in there. It's not gonna
fly around the vehicle loose, you know, if something were
to happen, right. And its legs are not powered,
they are static. So the way that you would set
this up, when you are done carrying it as far
as you want to carry it, is the seat also

(06:01):
has essentially a lever that can fold down into a
tripod-like position, so the two legs act as two
of the legs of the tripod. This lever acts as the
third, and if you were to pick it up, you
could fold that arm back up against the seat, so
that would allow you to put it into your vehicle
and secure the car seat there. And that's if you

(06:22):
were in an area that you wanted to set it
up on the side of the road that was you know,
like a field or something where there's nowhere for it
to sit, like on a bench or maybe on a
wall or something. Right, that's right. And, uh, you know,
like I said, it was running on essentially a tablet
PC, and if you looked at all the equipment, according
to the website, uh, it cost about a thousand dollars,

(06:42):
maybe a little less, And that was a calculated decision.
They wanted that. They, being the team behind this, and
I'll talk about them in a second, wanted the robot
to be inexpensive enough where it would not be an
obvious target for someone to just steal the components out
of it. They wanted it to be uh uh accessible.
They wanted it to be cute. They wanted it to

(07:03):
be something that people would want to interact with and
to have enough of an ability to have interactions, including
holding a conversation. Sort of. Yeah, kind of. Yeah, uh,
we're being real generous with the term conversation. That's one
thing that you and I talked about off air is
that every conversation we've ever seen with this, you know

(07:23):
that between a human and Hitchbot, it was awkward, to
say the least. I mean, it kind of
picked up on what the human
was saying, but not entirely, didn't quite get the gist
of the conversation, and it would respond in an awkward way. Yeah,
it had It had a microphone so it could pick
up on what people were saying, and a speaker so

(07:44):
it could then return and communicate back. And a big
problem with that was probably the vast array of
dialects that it was dealing with, right, and just the fact
that our spoken language is incredibly plastic and adaptive and
there are so many different ways to say the same
thing that it can be difficult for you know, it's

(08:05):
like a non native speaker of any language, you might
be taught how to say or ask for something a
very specific way, and that specific way is still correct,
it's not incorrect, but it's just one way to say that,
to express that thought. And in most languages there are
lots of different ways to express the same thought, and

(08:26):
you're familiar with one of them, so if anyone comes
up to you and uses a different one, you could
be completely confused, even though you know one way of
saying it that you understand; all these other ways you don't.
Same thing with computers. If you train a computer using
machine learning on what certain phrases mean, that's great, the
computer might be able to identify that. But if someone

(08:48):
were to ask for the same sort of thing but
word it slightly differently, that can be enough to throw
a computer off, because these are subtle things that we
humans can intuitively grasp, but computers lack intuition. And
there's not only the word problem. There's also the way
that it's said. So the, you know, the regions,

(09:09):
the zones, like you know, here in the South, people
talk different than they do in the Pacific Northwest,
or the Northeast, or, you know, the South...
People in the Northeast talk faster than I can hear. Yeah,
so it's different. It's different. Not only um, you know,
just the different languages, like of course this thing had
to learn German, had to learn, uh, you know, Dutch,

(09:30):
I guess. It had to know French to get through all
of Canada. Sure, yeah, exactly right, and of course English,
and you know, and not only that but the different
dialects along the way. Sure, yeah. So lots of challenges here.
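To make that paraphrase problem concrete, here is a minimal, hypothetical sketch of a phrase-matching intent lookup. It is not Hitchbot's actual code, which the episode doesn't detail; it only illustrates how an exact-phrase approach breaks the moment a rider words the same request differently.

```python
# Hypothetical illustration only, not Hitchbot's real software:
# a naive intent matcher that only recognizes exact trained phrases.

KNOWN_PHRASES = {
    "where are you going": "ask_destination",
    "do you want a ride": "offer_ride",
    "tell me a fact": "request_trivia",
}

def match_intent(utterance: str) -> str:
    """Return an intent only when the exact trained wording is heard."""
    key = utterance.lower().strip("?!. ")
    return KNOWN_PHRASES.get(key, "fallback_smalltalk")

print(match_intent("Do you want a ride?"))    # -> offer_ride
print(match_intent("Need a lift?"))           # -> fallback_smalltalk (same meaning, different wording)
print(match_intent("Where are you headed?"))  # -> fallback_smalltalk
```

A more robust system would need fuzzy matching or a trained language model rather than a fixed phrase table, which is exactly the kind of variation the hosts are describing.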
But the whole goal was not... The goal, really, I
don't think, was to have a robot hitchhike from one
end of Canada to another. That was sort of the

(09:51):
the face of this project. The actual goal was more
of an artistic expression as well as an experiment in
robot human interactions. Yeah, because the thing could have been
picked up right at the right at the start, right
at the very first day, and driven all the way
across by one person, you know, on a long haul
truck or something, and where would be the adventure in

(10:12):
that? What would be the fun? The idea was that
this relies on um, human interaction and human kindness to
get this thing from one place to the next, kind
of to take care of this thing and along the
way have kind of a checklist of things that they
wanted to do. Now, I don't know if the Canadian
trip had a checklist. I don't think it did. Uh.
The USA trip did have a checklist, which it
started, um. But the Canadian trip, um, you know, just

(10:35):
for instance, took twenty-six days to get across, um,
you know, the entire nation there. But it did things
like attend a wedding, um, it was dancing in Saskatchewan.
It met some of Canada's First Nations people, uh,
some of the Aboriginal people um that were um, you know,
native to Canada. And it did all kinds of things.

(10:56):
I mean, it went to you know, parks, it went
to scenic locations, um. All the time it was snapping
photographs, because it was programmed to take a photo every twenty
minutes and it could tweet out that information, so its
interactions were, uh, doubled, in that it could interact in
person with people who were around it and actually attempt

(11:17):
to have a conversation and interject. In fact, I read
one report of people who had picked up the robot
and they said, yeah, it was weird. There were three
of us in the car plus the robot, and we
would the three of us be talking, and the robot
would interject and interrupt us and often say something that
is completely not connected to any of the rest of

(11:38):
the conversation. And I thought, I've ridden in cars with
people like that, you know, just just someone just pipes
up with a total non sequitur, and you think,
are are we in a car with a crazy person?
What's that have to do with the price of eggs
in China? Yeah, something along those lines, And so, uh,
you know that that was definitely part of it. I

(11:58):
think there was certainly an element of let's see
how humans treat robots and also let's see how we
can design a robot that from the get go is
meant to interact with humans. Because One of the things
we're starting to see increasingly in technology is the robotic
sphere and the human sphere colliding, like, there. And

(12:22):
by design, we want robots in our lives. People have
named their Roombas, for example. They have given their
Roombas, uh... You know, they've imprinted upon them
this idea of a personality, and when something happens to the Roomba,
you know, "I'm sad." You're going to be... Well,
I'd be sad to be out, you know, two bucks.
That's... there's that side. But, you know, if something

(12:46):
happens to Fred... Yeah, don't name your Roomba. Right. Yeah,
but that's true: a lot of people do name
these robots. People get emotionally invested in these machines.
And so there is this growing field of research of
human robotic interactions. How can we one capitalize on this

(13:07):
need for humans to have an emotional attachment to these
otherwise emotionless beings, these beings that lack consciousness,
lack emotions. How can we capitalize on that so that
the interactions are are useful and meaningful in some way,
even if it's only meaningful for the human if it's
impossible for it to be meaningful for the robot, that's

(13:28):
okay if it's still meaningful for the human or how
do we design robots that are specifically meant to not
evoke that reaction because it would be just you know,
another distraction from whatever the robot is supposed to do.
Or maybe that whatever the robot is supposed to do
is inherently dangerous and should not you know, you don't

(13:51):
want to encourage human interaction if if a robot is
meant to do something like dig into rubble, you don't
want people to you know, worry about the robot. The
whole reason the robots digging into rouble in the first
place is likely to look for survivors in the in
the fall out of a building collapse or something, or
well just to prevent having to have a human do

(14:12):
the same thing. Yeah, exactly, yeah, yeah, and that's really
I mean, the way that a lot of robotics experts
see robots really taking off, at least in the near future,
is they'll be used to do jobs that are either
too dirty, dull, or dangerous for humans. So jobs that
are incredibly repetitive and don't require much thought robots are

(14:36):
perfect for that. Also, robots don't sustain like repetitive injuries.
You know, you do have to continuously maintain them. You
can't just expect them to work forever. But they don't
get carpal tunnel syndrome. Well, unless they fix themselves. Right,
and we we will get to that point eventually. And
dangerous Obviously, you don't want, you know, you would want
to be able to use a robot in a dangerous situation,

(14:58):
so that you're not putting human life at risk. Yeah,
but with the goal of being able to use it
again and again and again. Right. So, but those robots
probably don't need to have a lot of human interactivity.
They're designed to do something that they're replacing a human,
not interacting with a human. But at the same time,
we are seeing this growing industry of robots that are

(15:18):
designed to be around us in our daily lives, either
as a telepresence style robot where the robot is standing
in as a surrogate for an actual person, and you
might have like an iPad or something like that as
a head where someone can Skype in. And this is
always creepy whenever I see it done anywhere, but I

(15:39):
keep being told it's the way of the future. I
have never actually interacted directly with one in an official capacity,
but I've seen them at CES. Now I'm gonna, I'm
gonna make a reference to something that I have nothing,
I have no personal interaction with. I believe my wife
was telling me about a movie recently called Her and
it was a man who fell in love with the
operating system and the voice that that

(16:01):
operating system had. Now I can see something like that.
And let's say you get a refrigerator, and refrigerators have
screens on them now. Ours does here at HowStuffWorks.
And it's got a screen but doesn't talk to us,
But it does have a screen. You can interact, and
there's a lot of different things you can do with
that screen, including putting a wacky background image on it,
as someone did when they put on the, um, the

(16:22):
symbol for the evil organization from Lost. Um, I could
see if if it was talking to you every day,
and it had a likable voice, something that you felt
comfortable interacting with um that you know, I could see
somebody saying, well, I'd be sad to get rid of
that refrigerator in five years. Well, especially if you
would come home after a long day and your refrigerator says, Hi,

(16:44):
would you like a frosty adult beverage? I mean you
know you're gonna have a bond with that machine immediately. Yeah.
So there's this whole discipline that's coming up, like how
do we how do we define these interactions? How do
we shape them? Uh? And a lot of it means
you have to do study on both sides. You have
to do a study on the robot side like what

(17:05):
works and what doesn't, and you actually have to study
human psychology how do humans respond to robots and at
what point do humans end up treating robots as if
they are alive, as if they're living creatures. And for
a while people were thinking, um, well, the robot's gonna
need to look like something biological already, Like it's gonna

(17:26):
have to be like a robot dog or a robot
you know, android-type person, almost like a crash test dummy,
where it looks like a human but you can tell
it's not a real human. And it turns out that's
not necessarily true, because as we've already said, people have
been naming their Roombas, people get emotionally invested. It
turns out that if it looks animate, if
it appears to behave based upon its own decisions, whether

(17:48):
it's true or not, if it looks like
it's doing that, we start to kind of in our
minds give it these qualities. We'll be back with more
of the sad tale of Hitchbot after these messages. So

(18:11):
a lot of this really was studying that, like this
idea of the way people and machines are interacting and
how that is becoming defined over time and what we
might need to think about in that respect, and also
just kind of a you know, it's a happy story
about how people find joy in, uh, a silly... I mean,

(18:34):
really, when you get down to it, it's a silly robot,
not a bad robot. It's a silly robot. And
the the experience of discovery and sharing that with other people.
That was a big part of this project too, and
it was really successful for three out of the four
big things that it did. The one that wasn't so

(18:56):
successful was the United States. So, um, really quickly,
before we get into the US stuff, I was going
to talk about some of the folks who designed and
came up with this idea. Uh. The two leads who
first came up with the concept for Hitchbot were David
Harris Smith and Frauke Zeller. And I probably am mispronouncing

(19:19):
Ms. Zeller's name, Frauke Zeller. It's a name that I am...
I was not familiar with, totally new for me. And
they're out of Port Credit, Ontario. Yes. And, uh, Smith
is an assistant professor at McMaster University in the Department of
Communication Studies. Zeller is an assistant professor in the School
of Professional Communication at Ryerson University. Communications professors. Now, this

(19:44):
makes perfect sense because they're they're fishing for the way
people interact with this. They want to find out exactly
how people respond to this, how it responds to them. Uh,
this interaction is really, really interesting for these people in
particular, I'm sure. Right. And Zeller, she got her PhD;
her thesis was on human-robot interaction, wasn't it? Yeah,

(20:07):
And they were joined by a lot of other people.
I've just got a couple of names I'll mention, but
the team itself is quite large. You can actually read
up on all of them on the website. It's funny
because the way the website is written, it's written from
Hitchbot's perspective. So it's Hitchbot saying, oh, this is
the person who helped me learn how to talk and
it's very cute. This is the person that takes care

(20:29):
of my electronics. Yes, on a daily basis. I think
there are about fourteen or fifteen people on that team. It's
a big team. It is a large team, not just
the two leads here, right. So you've got people
like Colin Gagich, who is a developer on Hitchbot,
and he's also a McMaster University student. He helped design
and test hitchbot to make sure it would be able

(20:49):
to withstand the various environments that it would encounter. Keep
in mind, this was summer in Canada, so it wasn't
going to have to deal with a Canadian winter, just summer. Yeah,
it's a little bit different from the Atlanta summers. Slightly
less warm and humid, about sixty degrees cooler, Fahrenheit,

(21:09):
that is, um, not Celsius. That would be pretty incredible.
Uh So, then you had Davin Bigelow, who was an
undergraduate student at McMaster who worked on the conversational skills
of this robot. Karen Veal birth Fish, who was another
person who worked on Hitchbot's language skills, Dominic kal Kinn,
an undergraduate student at McMaster whose job was to monitor

(21:31):
Hitchbot's status and make sure the robot was okay. So again,
the robot was fitted with GPS and 3G capability
to essentially report back home saying, here's where I'm at
at any given time. And every twenty minutes, he's getting a
photograph sent from this robot to him to kind of
update its status, where it is right now.
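As a rough, hypothetical sketch of what that twenty-minute check-in loop could look like, assuming placeholder helper functions for the camera, GPS, and upload (the team's real implementation isn't documented in the episode):

```python
# Hypothetical sketch of a periodic check-in: every twenty minutes, take a
# photo and a GPS fix and report home over the cellular link. All helper
# functions are placeholders; this is not the Hitchbot team's actual code.
import time
from datetime import datetime, timezone

CHECKIN_INTERVAL_S = 20 * 60  # twenty minutes, as described in the episode

def capture_photo() -> bytes:
    return b"...jpeg bytes from the tablet camera..."  # placeholder

def read_gps() -> tuple[float, float]:
    return (44.6488, -63.5752)  # placeholder fix (roughly Halifax)

def report_home(photo: bytes, position: tuple[float, float]) -> None:
    # Placeholder: a real version would upload over 3G to the team's server
    # and could also post to social media.
    lat, lon = position
    stamp = datetime.now(timezone.utc).isoformat()
    print(f"{stamp} checked in at {lat:.4f},{lon:.4f} with {len(photo)} photo bytes")

def run_checkin_loop() -> None:
    while True:  # runs until the tablet loses power
        report_home(capture_photo(), read_gps())
        time.sleep(CHECKIN_INTERVAL_S)
```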
Yeah, and then, uh, there was the big brother robot to Hitchbot, kulturBOT,

(21:54):
K-U-L-T-U-R bot. Yeah, this was
a robot that preceded Hitchbot. This was a
different human-robot interaction experiment. KulturBOT's job was to
attend artistic exhibitions, take images of what was going on,
tweet them, and critique them. It was a robotic art

(22:16):
critic. Interesting. Anyway, so it would actually do the critique
on the fly, like right there at the event.
That's how it was described. But I didn't read enough
into it to find out how this actually worked. Like,
I don't know if it was capable of stringing together
any words, just based upon what it was seeing. I
don't know if it had human intervention where the human

(22:39):
was the one actually providing the caption. I don't know
the answer to that. Fascinating, but I do know that
that was essentially another project, uh, that
was being performed by much of the same team,
and Hitchbot was kind of the next step,
not directly connected. It was just one of those ideas
that Smith and Zeller came up with that they

(23:02):
thought was a really interesting concept. So after going through Canada,
they went to Germany. It had a lot of adventures
in Germany, went to castles, went to another wedding. There's
a great picture of a bride giving Hitchbot a little kiss.
Yeah, and, um, and, uh, lots of stories of people.

(23:23):
All of Hitchbot's journeys, by the way, are chronicled on
the website. There are blog posts that tell what happened
on each day. Some of them also have embedded videos
of the stuff that went on and also photographs. It's
very cute. Then after Germany, they went to the Netherlands
for a brief while, in early summer, for

(23:45):
a bunch of activities and events that I cannot pronounce. Yes,
I'm not even going to attempt... a series of festivals
with unpronounceable names, at least for the American tongue.
And then, uh, it moved over to the
good old US of A. Yeah, it started in Boston, right?
It was gonna go from Boston
to San Francisco. That was the goal. That was the goal.

(24:05):
And it also had a bucket list, which is appropriate
since it was a bucket. I have the bucket list
in front of me. Uh. Well, the bucket list has
a couple of check marks on it. Now. One check
mark was, um, uh, to do the wave at a
sports game, anywhere, it didn't matter where it was, to
do that. Uh. The other one was to see the
lights in Times Square, of course in New York City.

(24:25):
And there were others. There's other stuff along the way,
and I'll just mention a few of these because there's
probably, again, a lot of different things. The Grand Canyon has to be
on there. Let's see... Grand Canyon. Um, uh, you know,
I'll have to check. Oh yes, it does: see the
jaw-dropping views of the Grand Canyon. That is one. Yes,
in Arizona. Um, pose with the Lincoln statue in D.C.
was another one. Tan at Myrtle Beach. Um, experience

(24:48):
the magic of Walt Disney World in Florida. So it
was going to need to actually go south along the
Eastern Seaboard. I mean, for people who are not from
the United States and aren't familiar with our geography, if
you were going Boston to San Francisco, you would essentially
be setting your sights west. Yeah, just do west, just
go... yeah, just go west, and just keep on adjusting

(25:08):
your journey in order for you to get
to California, exactly. But with this list, it means that
you would first need to go south,
because you would have to go south from Boston well
to get to New York City, but also to d
C and to Florida. And then this goes all over
the place, I mean all over the Midwest. So there
are things to do in Illinois like explore the cloud

(25:31):
Gate in Millennium Park, um, stand under the Gateway Arch
in Missouri, just all kinds of things like this. And
again it's a it's a relatively long list and with
many different states, many different activities. And it checked off two
items on that list. Yeah, and the reason that only
two items were checked off is that Hitchbot met an

(25:51):
untimely demise at the hands of a vandal. Maliciously murdered. Yes, decapitated.
Can you just can you say murdered when it's a robot?
What... I mean, it was. I mean, disassembled? Yeah, that's
what Johnny Five was scared about. "No disassemble Johnny Five." Um. Yeah,
So I'm guessing disassembled is probably the best way of
putting it. At some point, you figure there's gonna be

(26:12):
another robot getting a delivery with a cardboard box, and
it'll just be what's in the box? Uh, and they'll
have the binary uh code for seven as the title
of that. Very clever, very clever. All right, So this
is a weird, weird ending. So, um, it is the
night prior to this, so it's it's what the end

(26:32):
of July, right, because I think this all went down
on August first, that's when we heard. Exactly, so July
thirty-first. Um, I think it was a Saturday night,
and Hitchbot was in Philadelphia and hanging out with a
vlogger by the name of Jesse Wellens. And it is
well documented what they did during their
evening, because, you know, Wellens being a YouTube personality, um,

(26:55):
he took it on the town. He kind of did
a lot of different things with it. There were other
people involved. There's another guy, um, there with him. His
name was Ed Bassmaster. He was another YouTube personality,
and they had kind of a fun evening. Yeah. The
whole the whole idea was that this is a I
mean it was. It was elevating Hitchbot's profile and elevating
the YouTuber's profile. This was this is a dream come

(27:18):
true for a YouTube personality because it gives you the
chance to interact with a meme while
it's happening. You're not capitalizing on it afterwards. This is
this is like a once in a lifetime type deal.
It can it can garner you international attention immediately. Yeah,
it really would because people were tracking this thing. People

(27:39):
are watching exactly where this is and they knew, you know,
when it was when it was in their city. They
knew where it was. They could walk. I mean, if
it said it's been sitting here at this corner of
you know, Main and Elm Street for the last twenty minutes,
you could go down to Main and Elm Street and
look at this thing, or pick it up and give
it a ride yourself if you wanted to. In fact,
the early part of the trip we didn't talk about this,
but it took a long time for it to leave

(28:01):
the Boston area. People in Boston were taking it to
different parks and different... you know, they were taking it out on
boats and things and taking you know, selfies with it,
and um, it took I think it was more than
seven days for it to get out of the Boston area,
which I think the team would have found wonderful because
it was what was happening was the robot was gathering

(28:23):
a series of rich experiences, and the people were gathering
the experience of interacting with the robot, which was the
purpose for this thing in the first place. So having
it take a really long time to get out of any
area would not be considered an impediment by
the people running the project. They, I'm

(28:44):
sure loved it. Oh yeah, in no way is that
a failure. That's a win, in fact. So,
you know, here it is, after this fun evening on
the thirty-first, and they placed it on the bench,
middle of the night. You know, it's late at night,
of course dark. They placed it on a park bench,
or not a park bench, but a bench on the
city street there. And that's it. I mean, you see
a cab driver arrive, I think, uh, and that's it.

(29:07):
I mean, that's the end of the video interaction with
Hitchbot at that point. And the next day we
wake up to the news that Hitchbot has been murdered. Yes,
its head had been removed from its torso and its
arms ripped off. And I gotta ask you this when
you saw the photograph of Hitchbot, because now this is
a dramatic photograph, I mean it did it did look

(29:29):
a lot like a true crime scene photo, in that,
uh, and that, horrific as this may be... I mean, Hitchbot's
arms were pulled off and placed above its head, and, um,
it was lying in a pile of leaves. It's headless
at this point. I mean, there are real crime scene
photos like that. Now, understand, this was starting... It was
starting to feel like this is the robot
equivalent of a serial killer crime scene. So it

(29:51):
was laid out in this, like, staged manner,
and, uh, very, very showy, I guess. And there
were actually websites or, you know, blogs that would say,
I don't really feel comfortable showing you this image, which
is weird because here it's just a bucket with a
couple of noodle arms and some rubber boots. Right,
if you saw this same collection of stuff in a

(30:12):
hardware store, you would just think, oh, somebody just left
their random shopping right here. That's pretty funny. I never
thought of it that way. Yeah, it's like, yeah, you
could gather the stuff up at Walmart and put it
on the floor and not think twice about it. But
now that we know that this is this, uh, this
thing that they've created that has a name, uh,
now it takes on a different twist, doesn't it?

(30:33):
It takes on a different feel, which, again, you might say,
while it brings that particular part of
the experiment to an end, it also says a lot, right.
It also tells you a lot about robot human interactions
and where they can go. It sure does. And right away
you would think, well, of course, Wellens and

(30:53):
Bassmaster have done this. Those are the two people,
the two characters that were involved with this last. And
then, lo and behold, three days later, two days
later or whatever, there comes this, uh, secret surveillance
video that was taken from a nearby store
that shows what happened. And someone wanders up in a
football jersey and just kicks the living heck out of

(31:13):
this thing and destroys it. But it's shown just
off screen. We don't actually see the person kicking and
destroying this thing. We see him kicking and destroying in
the area of the bench that Hitchbot was
known to be at last, right. And then later, uh, everyone
was reporting that this video itself was essentially staged. Yeah,

(31:34):
it's a fake video because you can go to that
scene and look exactly where it was taken from the
same perspective. There's no camera there. Yeah, so it's a fake.
And so what's going on here? Because they never have
found the head of Hitchbot. I mean, they never found
the, I guess, the CPU, the thing that would
tell them... The tablet PC. Yeah, that would tell them
what happened to it. Yeah. So

(31:57):
the best guess is just that somebody scavenged it. But
you know, in fact, the people behind the scenes,
the people behind the project, have said, we don't
care to identify the person who did it or
why they did it. That's not important to what we
were trying to do. With the taking of photographs every twenty minutes,
I wonder if it captured something and sent it without

(32:17):
the person, you know, the perp, knowing what
had happened, right? Or if it just happened
to fall within that time frame between when the photos
are taken and didn't capture anything, right. And it's just
like everybody else, they don't know anything. It's possible either way, uh,
the you know, of course there was a huge reaction
to this both ways. Yeah. Yeah, there were people who

(32:39):
were saying, this is awful, this is the worst. You know,
it really reflects poorly on the United States that a
robot that was capable of safely traveling from one coast
of Canada to the other, and also in Germany and
also in the Netherlands gets barely into its journey here
in the United States before it's destroyed. That seven days. Yeah,

(33:01):
that was a telling, um, condemnation, if you will,
of the United States in general and Philadelphia in particular.
You can think of all the different comments that were
immediately happening afterwards. A lot of people would say like, well,
this happens to real hitchhikers as well, it happens to
people well, and then there were a ton of comments

(33:22):
that said, well, of course this happened in Philadelphia. Yeah,
there's that. And there's another group of people that would
just respond with something like, big deal, it was
a bucket of bolts anyway, it was a machine. Yeah,
and, I mean, it's, again... This ties right back
into that robot-human interaction. You might think the experiment's over.
I would argue that the team probably says, no, it's

(33:44):
still going. It doesn't matter if the robot is gone.
The continuing conversation around this is still informing us and
still giving us a lot more data about how humans
view robots, how we can identify with them, we can
imprint an emotional response to them, and in fact, they

(34:06):
have tweeted out, you know, continuing to tweet out as
the robot, saying the robot still loves people, still loved
its adventures, which I think needles it even more, right?
You made the robot this innocent creature. It reminds me
a lot of, um, the various Mars rovers, or
like the Phoenix Lander specifically, when it was
nearing the end of its mission, like it had gone

(34:28):
well beyond what the mission parameters were and it was
to a point where it was no longer going to
get enough sunlight to recharge its batteries, and, essentially,
the social media team at NASA sent out a final
tweet from the robot, keeping in mind the robot was
never tweeting directly. It was always a human being taking
data from the robot and then messaging it out. But

(34:51):
people identified with that robot, and when that last tweet
came out, people cried. They're watching this thing and
they're thinking... they're almost making it into, like, it's
like you're watching a dog or something like that that's
dying. That thing right there is dying and I'm
watching it happen. And they're not understanding that... It's like, um, now,
I know it's way more complex than this, but it's not like,

(35:13):
well it's time to get a new toaster, right, you know,
It's not like a machine that you don't really have
any kind of personal attachment to. Like, if my
microwave were to malfunction, I'd think, what a pain
in the butt, I have to go and get a
new one. But you're not gonna cry. No. But, and
again that that goes right back into that idea of

(35:34):
how do robots and humans interact? And and we do
need to think about this because if we enter into
a world where we start treating these relationships as really casual,
when stuff happens and people go through an actual grieving experience,
we won't be ready for it. But if we know

(35:54):
ahead of time, we can say, all right, you know
what, we know this about ourselves. This is something
that is innately human, at least for many people. Yeah,
and then you think, all right, now I
can design a product and market a product
and do it in a responsible way that doesn't

(36:15):
say this is a weird you know, aberration or anything. No,
this is a very human kind of trait that a
lot of people have. Whatever it elicits from the humans, that's
what you're looking for. Yeah, and so, or at least
that you account for it, even if that's not the
purpose of whatever it is you're making. You at least
account for the fact that it exists. Scott and I

(36:35):
will be right back with more about the sad tale
of Hitchbot after this. In the fallout of all of this,
we've seen not just the condemnation of an act of
violence against what seemed to be an innocent creature that

(36:59):
loved adventure and meeting people, or, as Smith had called it,
a story collecting and story generating machine, like that was
its purpose. We've also seen people come together to say,
we can't let this be the end of the story.
Um, there are a couple of different groups in
Philadelphia that had said, let us build a robot to

(37:25):
continue on in the spirit of Hitchbot, because I can't
stand for Hitchbot's story to have ended in the city
that I call home. Let's have a chance
to make this right. Yeah, and we're gonna do it. Yeah.
So the tech community in Philadelphia has responded
with this and actually received some more or less tentative

(37:47):
thumbs up from some of the members of the
Hitchbot team to give this a go. And so there's
a group, um, that gathered at The Hacktory. The Hacktory,
it's like a hack factory in West Philadelphia. Um, this
was... this is very recent, when this happened. Uh,
And they have come up with an idea that they're

(38:09):
calling the Philly Love Bot, which... I don't like the
sound of that. Yeah, an odd choice for a name. Well,
as in brotherly love. Okay. I don't want to...
I don't wanna take this in the wrong direction or anything,
but we're not talking about a sex bot. Okay, okay,
a sex bot. I didn't want to just come right out
and say it, but it seems like I've heard of
products like this. Yeah, no, not that kind of love.

(38:30):
But this is more of a love for all mankind
and robots kind of thing. What kind of bot podcast have
I wandered into? Yeah, no, I'm not going to do
another bait and switch on you, Scott. I already promised
you I wasn't gonna do that. Not doing it this time, honest.
So they said that. Um, the idea they have is
they would build a robot that was designed to be

(38:53):
passed from one person to another, so it's not designed
to hitchhike from one location to another location. There's no
location requirement, at least in their initial approach. Instead, what
they want is to design a robot that when you,
when you take possession of it, when someone gives it
to you, you are tasked with performing a good deed

(39:14):
however you define it, and it gets documented by the
robot itself. The robot you take along with you to
do, whatever this good deed may be. Christian idea. And then
you pass it on. It's like a pay-it-forward.
You pass the robot on to someone else, and it is
their duty now to go out and do a good deed.
And the idea is to kind of atone for the

(39:34):
horrible murder of Hitchbot by promoting good deeds, and the
robot is kind of, almost like, a totem
for that. I was gonna say, I like this idea,
But you could do the same thing with a with
a carved stick. You could hand a carved stick to somebody.
Now that you're in possession of the stick, it's your
duty to do a good deed and pass the stick
onto somebody else. It doesn't have to be, uh, something

(39:56):
that collects and gathers the information. But I guess that
keeps everybody kind of honest, doesn't it? Well, yeah. And
I think also, you know... They also are
planning on having the robot, which I am going to
guess is going to be, really, another computer,
not so much a robot. They're going to have it
capable of interacting with you, just as the hitchbot could. So,
in other words, there will still be that robot human

(40:19):
interaction element that will play a part in this experiment,
but the nature of the overall experiment, the
perceived purpose, will be different. Now, isn't this funny? Because
I wonder what some people are going to consider a
good deed too, because there might be some comical examples
of what people consider to be their good deed for humanity,

(40:40):
Like... yeah, I imagine we would
see everything from someone saying, all right, I'm gonna take
this robot with me while my company and I
go out and we clean up a neighborhood. That
could be one. Or it could be, I'm going to
set this robot here on the corner so it can
watch me as I stop traffic so this mama duck
and her baby ducks can get across the street. It
could be anything. Yeah, like, oh man, you almost spilled

(41:03):
your beer, but I saved you. I'm passing this
thing on. And see, that also, again... Look,
if you look at it as an experiment,
that's still meaningful data, right? That's true. It's interesting,
you know, it's kind of like a joke, but it's
also that's humanity too, That's true. And that's exactly what

(41:24):
the initial goal of this whole thing was. I mean,
it's to see what happens. It wasn't the goal of
getting this thing across Canada, because they could put it in
a box and ship it if they want, or just have
a trucking company haul it. Like we said, you know,
one shot, all the way straight across. But the idea was to
see what happens along the way. It's like the journey
is better than the destination. Exactly. Yeah. And it's

(41:44):
those experiences that were important, and that documenting of culture. And
we're talking about an emerging culture now, not just tradition,
not just the embedded culture that's been around for generations.
We're talking about an emerging culture of technology and
our daily lives intermingling on a level that has been...
It's unprecedented. We've never seen it like that, and it

(42:08):
grows every day, so fascinating, really a fascinating experiment. I
wouldn't call it a failure at all. I mean, I'm
sad that the Hitchbot didn't get further along in its
journey so that more people could experience it and that
we could have more stories. But it's all right, because
the story continues. It's just the Hitchbot chapter is over.

(42:31):
So when you look at it that way, it's actually
really interesting and inspiring. And you know, of course, you
might say, well, I hope that the next robot meets
with more success and doesn't have the same kind of encounter.
But if we do see these kind of encounters happen
again and again, then we have new questions to ask, like,
why is this happening? Uh, you know, what

(42:53):
are the motivations behind it? Are there things we need
to look at as a society? Not
because we want to protect robots, but are there
underlying issues that this is just an indicator of,
and maybe there are things we need to fix? Is it
some real anger, some deep-seated anger against robots, or

(43:14):
or even just one of those situations where clearly the
person who was trying to scavenge it wanted to get
it for the parts to sell for some reason, and
then... well, if that's in fact the answer, then
you might say, all right, you know, this
is yet another indicator that there are conditions that maybe
we should look at and really talk about. And yes,

(43:36):
this is a kind of trivial way of highlighting that,
and it's stuff that we already know, but it's another
way to say, think about this. I mean, we're really
talking about compassion, and, on a level, that
is a very, you know, human trait, a very innate
trait in us. Maybe we should apply that to our
fellow humans, not just to the robots. Even

(43:59):
the human that caused damage to the robot,
we should show compassion too, because we don't know the
reason behind it, and there may be reasons that we
can't even identify with because we're not in that situation,
and that's all the more reason to show compassion. And
that's exactly what they're... Again, I keep going back to this,
but that's exactly what they were looking for when they

(44:20):
started this whole experiment a couple of years ago. So
this has really been fascinating and I can't wait to
see what the next phase will bring to us. Can
I ask you one question before we leave here? And
I... we had discussed this, but only briefly, and we
didn't really get into much detail. But, um, had you
not known about Hitchbot, right, had you never heard of

(44:40):
this whole thing... Exactly. You passed it on
the city streets driving, would you stop to pick it up?
Not a chance at all that I would stop. Because,
and you made me think about this because I was
coming to it from the perspective of knowing about Hitchbot.
If I saw Hitchbot, I'd think, holy crap, there's

(45:00):
the hitchhiking robot. We've got to take part in this.
This is something special. And I feel the exact same way.
But not knowing about Hitchbot, not knowing about it, and
seeing a bucket that has electronics attached to it, even
with the happy face, maybe particularly with the happy face,
I might think, oh, what, this is like a suspicious

(45:21):
device or something. I thought, I said, it looks
an awful lot like an IED. Yeah, I thought
it's not too far off in the description. I know
that they're a little more um, I guess camouflaged in
the way that they typically do those things. But this
just seems to me like not a good idea to
pick something up, like, you know, something like this up
on the street and you know, strap it in the

(45:41):
car next to your kids. Yeah, not really, but I mean,
knowing what it is, yeah, of course you'd want to
do that. You'd want to you know, it'd be a
great experience for you and your family, you know, to
to do something with it, even if you drive it
a mile or five miles or whatever, just take a
quick photograph with it and say you were part of
that journey. That's kind of cool. It's actually kind of
interesting to me that Hitchbot spent seven days in Boston

(46:05):
because Boston is also where we had the Aqua Teen Hunger
Force Mooninite bomb scare. It was in two thousand
seven and it was in Boston. It was in Boston, yeah. Yeah,
that's with the neon, or no, I'm sorry, LED. Yeah,
um, well, yeah, you can tell them what it was. Yeah,
the LED... there were these two

(46:26):
characters from Aqua Teen Hunger Force, these two Mooninites from the Moon.
They look like, uh, like 8-bit characters from
a really crappy video game, and they're specifically made to
look like that. They're two-dimensional. When they
turn sideways, you don't see them anymore because they're gone.
And they're hilarious. They are hilarious. They are incredibly inappropriate,

(46:46):
as is everything on Aqua Teen Hunger Force, but they are hilarious.
And there was a publicity stunt where, uh, these
LED signs of the two characters were put up
in various locations, and in Boston it caused a bomb scare.
People thought that maybe it was the indication of an

(47:07):
explosive device nearby, and so they were dismantled, and it
very quickly became kind of a joke slash a discussion
about how you have to be very careful in the way
you present these kinds of guerrilla marketing attempts, because in
a post-nine-eleven world, they can be misinterpreted.
It was post nine eleven and pre marathon bombing too, yeah,

(47:30):
so it's kind of in between. But um, they were
on high alert there for a while about these signs,
and then you know, sheepishly Cartoon Network had to say, oh,
that was us, yeah, and here's what happened. But
in fact they were a little reluctant to say that
was us. Well, but you know, then again bad press
is still press. That's true. So I'm actually amazed that

(47:52):
Hitchbot didn't meet with any hitches in Boston,
based on that, um. And as you... when
you asked that question, you gave the qualifier, hey, you've
never heard of hitchbot and you see this thing on
the side of the road. I definitely would have wondered
what the heck it was, and I probably would have
thought I might not want to get too close to
that just in case. Yeah, sure, And imagine if you were, uh,

(48:14):
you know, somewhere in Canada, you know, where it's
wide-open farmland, and, uh, you know, it's
a mile between houses, and this thing is propped up
on its legs and its tripod seat there,
out in the middle of nowhere. I don't think I'd stop.
And in that case, I probably would stop, only because
I would think, who the heck would set up something

(48:35):
sinister in the middle of nowhere where you are not
likely to affect much of anything at all, And that's
how they get you. Whereas I would be more concerned
about the city location where the opportunity is higher.
Yep. I just see it as, like... and that
was the last thing that he thought. Well, I already

(48:56):
told you that I was worried that my obituary from
yesterday was going to say, choked to death on Twizzlers.
So it's actually much worse. Yeah. No, especially since I
hate Twizzlers. All right. Well, at any rate, this was
really a lot of fun to talk about, and it
was fun to kind of, you know, think about the
weird adventures, which are all, like I said, documented. You
can go to the Hitchbot website and read up on

(49:18):
the different days and events and things that it encountered
and the people it met. So many successful journeys and
so many events and things that happened,
and it posted about all that stuff, and all
its interactions are recorded in some way, which is great,
so you can actually go back and relive those journeys.
And I do hope that we see some further experiments

(49:39):
that are in the same spirit, whether or not
it's another hitchhiking thing or it's like the Philly Love Butt, or...
I know, it's like I'm ten years old.
I can't... You can't say that, and I still giggle
every time. But, you know, I see what you mean. I
anticipate a bunch of copycat Hitchbots popping up.
Um, here's another little thing that we didn't get to

(50:02):
really talk about this, but I think the very next
thing we're going to see out of this unfortunately, and
this is just my gut feeling. You said, somebody probably
scrapped the head, you know, the control unit. I have
a feeling we're going to see a photograph of that
show up somewhere that's going to be sent to the
creators of Hitchbot, which, to the
creators, probably would just be yet another data point. Probably, yeah.

(50:24):
But I see it as going the way
a real crime against a human would have gone. Um,
you know, and that there will next be a kind
of taunting note sent to them as well, which,
you know, I hope I'm wrong. But if that does happen, though,
it is interesting because it's just that whoever does it
obviously would think of it as uh or at least

(50:46):
I would guess this is armchair psychology. But I would
imagine that they would think of it as like a joke.
That you know, I'm treating this as if it were
a real person. Meanwhile, there are other people who think
that's sick, because they do think of Hitchbot as
at least in some way similar to a person.
And this is, again... Here, I read a lot of

(51:06):
true crime, so it's not that I'm just thinking about
this all the time. I mean, I'm just saying around
the one year anniversary, just pay attention to what's going
on the news. It might happen, might not happen. I
don't have any inside info or anything like that. I
could just as easily be that someone saw it and thought,
oh, I want a tablet PC. Could be. And it
could have ended up in a dumpster half
a block away. Yeah, yeah, it's hard to say,

(51:27):
but this was a lot of fun. Scott, thank you
for coming on the show. Thank you again. I appreciate it,
and likewise I had a good time doing it. I
hope you enjoyed that classic episode of TechStuff, The
Sad Tale of Hitchbot. This is why we cannot have
nice things. If you'd like to get in touch with me,
let me know what you would like me to talk about.
Maybe give feedback on the show. Be gentle, I'm a

(51:50):
delicate flower. You can do so in a couple of
different ways. One way is to download the iHeartRadio
app. It's free to download, free to use. You
just navigate over to TechStuff. There's a little microphone
icon there. If you click on that, you can leave
a voice message up to thirty seconds in length. Otherwise,
if you prefer, you can reach out on Twitter. The
handle for the show is TechStuffHSW, and
I'll talk to you again really soon.

(52:18):
TechStuff is an iHeartRadio production. For more podcasts
from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.
