Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to Tech Stuff, a production from iHeartRadio.
Hey there, and welcome to Tech Stuff. I'm your host,
Jonathan Strickland. I'm an executive producer with iHeartRadio.
And how the tech are you? Yeah, I'm still sick.
I'm working on it. It's a process, but I don't
(00:26):
want to leave you without an episode, and I also
don't want to subject you to having to listen to
me sound like this for any real length of time.
So I thought I would bring you an episode from
the show Forward Thinking that we did several years ago.
And by 'we', uh, the podcast was hosted by myself,
Lauren Vogelbaum, whom you might know from Savor as well
(00:49):
as lots of other stuff, and uh Joe McCormick, who
you might know from Stuff to Blow your Mind and
other stuff. So Lauren and Joe and I did episodes
about futuristic stuff, not always technology, but often technology, and
this particular episode is technology related. The title of the
episode is your Body as a Computer Interface. I hope
(01:13):
you enjoy. Welcome to Forward Thinking. Hey there, and welcome
to Forward Thinking, the podcast that looks at the future
and says I sing the body electric. I'm Jonathan Strickland,
(01:34):
I'm Lauren Vogelbaum, and I'm Joe McCormick, and uh, guys,
this just in... dih-dih-dih, dih, dih, dih...
I'm sorry, it's Porky Pig, Lauren. Similar? No, no, no:
breaking news. We got a tweet on Twitter from one
Chris Newcomb who said, in case your listeners didn't know
(01:55):
concrete ships, and he provided a link and and said, hey, kids,
swing around this thing for fun. And the link is
to a wiki entry for the SS Palo Alto,
which is a concrete tanker ship from World War One.
Why are we bringing this up? Is it
because in a recent episode we made jokes about concrete ships. Yes,
(02:18):
that's exactly what we did. In our Building Materials episode,
which came out a couple of weeks ago or so,
we all had a really good laugh about how concrete
ships are a completely ludicrous idea that no one would ever
possibly put into use. I think specifically what happened was
we were talking about what you should make out of concrete,
and I said, obviously boats. Yeah, I had no idea
(02:40):
that this was a real thing. In my defense, because
I was the person who immediately struck that down. This
ship was retired, like, in ninety nine, and then it
broke in half. So... but there are totally ships that
are made to this day with ferrocement, which is
a type of concrete that's reinforced with mesh and rebar and
(03:01):
stuff like that. Uh, so, so that happens. And I
just wanted to update you guys. Important news updates! Well,
thank you for the info, Chris. I I feel much
more educated than I was before. And I also now
feel like I have a magic power to make unlikely
seeming things real by laughing at them. So what if
there was a boat made out of uranium? You know what.
(03:26):
As as funny as that is, and as much as
I would love to dwell on it, I I did.
I did a little bit of research on a totally
different topic. Uh yeah. Yeah. First of all, thanks again
Chris for sending that message. And we we really love
hearing from you guys, and it was awesome learning something
that we thought was ridiculous. Uh. Today we're gonna talk
about something else that at times comes across as a
(03:47):
little ridiculous, which is the idea of, well, what is the
future of the kind of interface that we use
between us and our technology? Specifically, what happens when we
turn ourselves into that interface? Yes, but 'in your face'
is what happens. I guess that kind of is... I
(04:08):
wanted to ask: do you guys have any, like, really good
terrible interface stories? Well, I mean, just, here, did you
have a favorite terrible interface? To me, it's not so
much a terrible interface as it was a frustrating process
of learning how to use it. I'm left handed, and
I'm old enough to remember personal computers before the mouse,
(04:30):
and when the mouse came along, and then suddenly the
mouse was absolutely required in order for you to be
able to to navigate through software, it became very difficult
for me. Like other folks were picking up on it
very quickly, for me, it was more work because I
didn't have that kind of fine tuned precision with my
right hand the way I do with my left hand,
(04:51):
and they didn't have left handed mice, and and by
the time they did, I was so used to using
my right hand for that, I couldn't use a mouse
left-handed. I mean, it felt wrong and
nonintuitive because I had trained myself how to use
it with my right hand. But for me, that interface
initially at any rate, was not great. But that was
(05:14):
again due to my own personal uh, left handedness, my
sinister state of being. You know, I think you've probably
had experiences with other maybe awkward interfaces. What about Google Glass? Yeah,
Google Glass. Actually I was all right with Google Glass
because I knew going into it what the interface was
(05:34):
going to be like. But in general, actually Google is
a great example. Just in general, Google, uh is a
company that is clearly populated by engineers who have a
very specific idea of how to solve a problem, which
may or may not translate to any other human beings experience.
And so when you use a Google product, you have
(05:57):
to kind of learn how to think in the way
the engineers were thinking when they designed it, and then
it works. But initially it can be very difficult to
kind of suss out what you're supposed to do or
how you're supposed to get your technology to do the
thing you wanted to do. Yeah, well, sure. I mean,
there's a learning curve to any kind of interface that
you use to to to use technology. I'm still completely
(06:19):
mystified by the N64 controller, designed for someone
with three hands. I can't even speak about it. It just
makes me frustrated. It just makes me feel like I'm
about to lose and get a turtle shell, right, or
the old or the very old Microsoft Xbox Duke controller
which was roughly the size of a Volkswagen Beetle like
(06:44):
some of those. Or, you know, think about some of the
really frustrating interfaces that were meant to be groundbreaking, like
the Power Glove. I mean, I don't know if you
guys might be too young to appreciate what a terrible,
terrible product that was. I never I never owned one.
I don't think I even knew anyone who had. I
played with one, and it was you would rapidly go
(07:08):
from 'this looks so cool on me' to 'this is
completely useless if I want to actually play a game.' Um. Yeah,
there are a lot of examples of interfaces out there
that got in the way, and really what we're trying
trying to talk about today is the attempt to get
the interface out of the way to have a seamless
(07:28):
interaction between us and the technology that we depend upon
and want to use. Yeah, okay, so I guess we
should do a quick bit of definition just in case
you haven't been following us. So much so far you
probably have. But just to be clear, what is an interface.
It's a system that controls input and output. It's the
interaction between a human and a computer. And so this
(07:50):
can be hardware or software. The hardware would be things
like a screen, where you see what's coming out of
the computer so you can know what the computer is doing
for you, and hardware input like a mouse or keyboard. Uh.
And sometimes you can combine input and output into a
single device, for example a touchscreen. Classic example: input and output
in the same object. But interfaces can also be software.
(08:14):
So the classic example is the graphical user interface. Like
when you go and use Microsoft Windows, you're interacting with
memory in the computer through pictures. You know, you move
files around into different folders. Really that's data coded to
different locations in the memory, but there's a graphical representation
of it to make it easier for you. You're
(08:36):
also executing programs that are represented as some sort
of graphic. Uh, you know, of course, if you were
using computers before GUIs became a big thing, then
you were typing all those commands out and you had
to remember what all the different commands were in order
for you to do things like navigate to the right
file directory to execute a file. But the
(09:00):
graphical user interface ends up simplifying that by creating this
visually oriented approach to interfacing with the computer. Yeah, and
so if you look at the history of computing, it's
clear that interfaces are always changing, but not necessarily at
a constant rate, not like at the rate that processing
power seems to steadily multiply over time. Right. Yeah, this
(09:22):
is another great example. We've talked about this when we
mentioned Kurzweil, and Kurzweil's look at things like
Moore's Law, and you start to try and, uh,
draw conclusions about what the future is going to
be like by using Moore's Law as your starting point.
But that is really deceptive because, like you're saying, Joe,
not everything progresses at that same speed. Right. Well,
(09:46):
and that speed in particular is a little bit, uh,
what's the word? Self... Oh, it's self-fulfilling. Yeah, it's
a self-fulfilling prophecy, because no engineer wants to be
in the generation that let Moore's law die, you know, Like, no,
We've got to figure out another way to double the
processing power of this computer within the next eight to
twenty four months. Yeah, but but no one has figured
(10:07):
out how to double the mouse. Right, Hey guys, I
came up with it. I taped a mouse on top
of another mouse. It's double mouse. Doesn't really work that way.
So early computers didn't have a monitor or display. Right,
You would get your output in some other format, Like
it could be punch cards. You could get a series
of punch cards. You've run your your program through. Your
(10:29):
input was punch cards. Your output was a different set
of punch cards, and that's your compiled program. That kind
of stuff. Or you might get, like, a printer that
prints along a long, uh, strip of paper, like
tape, essentially, is what it ends up looking like. Actually,
a friend of mine, Richard Garriott. He's known as, well, he's
(10:50):
a game designer. His first game that he ever designed
printed on tape like that. So every move you made
in his little dungeon-based crawler, it would print out
what that looked like. It was just a very simplified
representation of a dungeon, and when you would make a move,
it would have to print another sheet out to show
(11:10):
you what had happened. The original idea of the interface
of the universal Turing machine was a strip of paper,
in which calculations would be done one at
a time, printed on a long strip of paper. Yeah.
So eventually we evolved beyond that. We got monitors
and displays. Uh, we got keyboards, which were much easier
(11:31):
to use as an interface with a computer as opposed
to just a collection of punch cards. Later on we
got the mouse. Um, Xerox... Xerox's PARC division
figured out the mouse and the graphical user interface, although
of course it was Apple that took great advantage of
that with the Macintosh. Um, and so that became like
(11:51):
the standard interface for computers and the mouse and keyboard
for a really long time. That was input. Yeah,
I mean, you had some other, like, fringe
input systems, things like light pens and stuff. But yeah,
besides the joystick, the general public really
didn't interact with much. The Dance Dance Revolution pad! Yeah. Now these days, we
(12:15):
we've gotten to a point where we're seeing a real
revolution in interfaces. Right. It's weird that we
think of, like, keyboards and mice... mice, mouses,
whichever it is... um, as the standard, the standard way
to control input right now. Screens are still pretty much paramount,
but i'd say that, you know, I bet the majority
(12:38):
of our web traffic comes from touch screen devices, not
mouse and keyboard driven devices. That's absolutely correct, and you
can see it by looking at the metrics. Um, yeah,
we're getting tons of traffic through mobile, and mobile in
general is using touch screens. Now, you could also use
voice commands for a lot of that mobile stuff if
you were so inclined, just commands to say, 'Okay Google,
(13:01):
HowStuffWorks.com.' I apologize if you're listening
to this on an Android device and you have the
speaker active. Oh, you could do gesture control through
an Xbox Kinect or something like that, and there's the
Leap Motion controller for PCs. There are a lot of
examples of that. We'll talk a little bit more about
gesture controls in a bit. But once you
(13:22):
get into gesture control, you're entering a new kind
of territory, aren't you? Because there the issue is, uh, well,
think about it like this: you're using the body
itself as part of an interface with a machine. So
using the body as an interface, it's an interesting, kind of
counterintuitive idea, since an interface is supposed to be the
(13:44):
bridge between the computer and the power of the computer
and you the user, what you're getting out of it.
But if you think about it, we've been bridging this
gap for so long with physical devices that are connected
to the computer. Why not flip the script and build
a bridge in a place that's physically connected to the user. Right,
(14:05):
And we've seen some some examples of that as well.
You could argue that VR headsets are are getting into
that where you are are wearing the computer device and
your physical motions are what allow you to experience that
computer power in the way it was intended. Sure, or
(14:26):
the aforementioned Kinect sensors, you're using your body to interact. Yeah, exactly.
We'll be back with more of an episode about your
body as a computer interface after these messages. So we're
(14:46):
gonna talk about not just ways where you are interacting
in order to create input into a device, but also
how you experience the output from that device, right, Yeah,
So yeah, we should mention some basic ideas in
what's on offer here in terms of the body
as an interface. So the input is probably the more
(15:09):
obvious one. With input, instead of pressing buttons on an external device,
moving a mouse or whatever, you might simply perform an
action in and of your own body. So this
could be gestures, movements, poking yourself, wiggling a part of
your body, wiggling multiple parts of your body. Other people's bodies? No, see,
(15:32):
then it wouldn't... that wouldn't quite meet
the criteria, I think. Well, it'd still be a body interface,
it just wouldn't be your body. You'd have to have a dead body.
A Weekend at Bernie's device. That reminds me of
a movie that's coming out. Uh, go ahead. I think
a complicated system of shrugs would be really good, right, yeah, right,
so you never have to lose your cool while you're
(15:53):
controlling your phone. I meant to, I meant to do
the 'please tell me more' shrug, but it turned
out as just a shrug, or a combination of side-eye
and shrug. Anyway, the key idea is that you don't
have to touch or manipulate anything except your own body.
got it with you all the time. So there you go.
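[Editor's note: in software terms, body input like the gestures described here ends up dispatched the same way as any key press: a recognizer produces a gesture name, and a table maps it to a command. A minimal illustrative sketch; the gesture and command names are invented for this example, not from any real product.]

```python
# Sketch: once a recognizer names a gesture, it dispatches like any
# other input event -- a lookup from gesture name to command.
GESTURE_COMMANDS = {
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
    "double_shrug": "dismiss_notification",  # the complicated system of shrugs
}

def dispatch(gesture: str) -> str:
    """Map a recognized gesture to a command; unknown gestures do nothing."""
    return GESTURE_COMMANDS.get(gesture, "no_op")
```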
And this idea isn't super weird to
(16:15):
us, because we're already familiar with stuff like the Kinect
and gesture controls to some extent, even if they're
not super awesome yet. But then you've also got output.
So instead of having to look at a screen that's
attached to an external device, which would cover the vast
majority of output today, I'm having trouble even thinking of
a standard device that has a very different way of
(16:37):
doing it than this. Well, I mean there's a growing
list of devices that now are using audio. Yeah, so
things like Amazon's Echo, right? That would be... you're
not... Yeah, that's a good example, but I get
what you're saying. Yes, the vast majority, I would say,
are are visually oriented. So yeah, instead of that, information
about your computing task is made available directly to your body.
(17:00):
The classic sci-fi example of this would be the
virtual retinal display. The screen is projected directly onto your retina.
But I mean, that's a little crazy. We probably won't
have anything like that for a long time. If ever.
Now we have some other examples that we'll talk about
when we get to haptics, because that's one of
the other methods of getting feedback from a computer that
(17:20):
can be meaningful, it's just not visual. Yeah. So I
would say the middle step between standard input and output devices,
standard interfaces, and this, your own body as an interface,
would be wearables, because that's, uh, it's getting close to
your own body, but it's not quite your own body yet. Yeah,
it could be something that is unobtrusive where you might
(17:43):
not even think about it consciously after a while after
wearing it. Yeah, sure, like a Fitbit, or
the pieces of jewelry that they're coming out with, something
like that. Absolutely, that was a great example, and the Fitbit
is in fact a fantastic example, because so much of
that interface is invisible to you. That again it's the
idea of removing that barrier. So a fitbit is going
(18:04):
to be tracking your steps. You might have other wearables
that are doing things like tracking your heart rate, uh,
that sort of stuff. Where the data is going to
some form of cloud-based solution, or potentially just
being beamed to a local, uh, device like a smartphone
or tablet. You're still usually dependent upon that other device
(18:26):
in order to be able to consume the data in
some way. It's presented to you in some way, so
it's not like it's directly getting that data to you
through the device itself. You're you might have a display
on the device that gives you some of the basic information,
but your body is not acting as the display, right,
it's still the device. Um. It does allow us to
(18:48):
have other means of interacting with our technology. Some wearables,
you could argue... like, the Nintendo Power Glove was a wearable,
not a particularly good one, but it was a wearable. Uh, it
was not unobtrusive, you know, it was definitely,
you knew someone was wearing one. But it allowed... well,
that was part of the point, right, You wanted to
(19:09):
show off that you were playing with power Yeah, exactly,
And you had a glove to yea, and you looked
a little bit like Michael Jackson in that era, a
little bit, yeah, which at the time was really important. Yeah,
because if we couldn't get the jacket, we could at
least get the Power Glove. But do you all
remember the Nightmare on Elm Street movie where Freddy wears
(19:30):
a Power Glove? I remember seeing it... no, it's gone from my mind,
I think. Okay. Oh god, I was so worried
about myself for a second there. Okay, now everything's gonna
be fine, you guys. It's Elm Street six, though, I
think. They were probably on four. But at any rate,
I get what you're saying. Um. Yeah, So, so one
(19:51):
of the things I wanted to talk about this is
kind of a this is looking ahead at wearables. So
you could have wearables that are are something that you
don't interface with directly at all. It could be you know,
we've talked about the possibility of things like
RFID chips to have a profile. This is
sort the sort of thing that Bill Gates was putting
in his house where you would get a little
(20:12):
RFID badge that would have a profile
programmed into it that's personalized to you, and then your
experience as you walk through his house would be to
see the kind of art that you like, to hear,
the kind of music you like. The lighting condition would
be to your preference. Uh, it would be dependent upon
that profile. We've since reached a point where we can
(20:32):
get a little more advanced than that. You don't necessarily
have to have a wearable anymore for that kind of stuff.
But that was one implementation. Another, however, is this, uh,
this haptic feedback solution that I had been talking about.
Haptic being having to do with your sense of touch? Exactly.
So we already are familiar with technology that has haptic feedback.
(20:55):
Here's one you're very likely to be familiar with: if
you ever played a video game where the controller
buzzes in your hand when you do something, right? That's
the old rumble pack, right. Those have been around for
for a couple of generations of video game consoles and
also for PC controllers, And generally speaking, you you want
to have a controller that enhances the experience of playing
(21:18):
a game. So, a great example would
be a stealth-based game, where you're skulking around
in the shadows and your controller might start to vibrate
to indicate that perhaps you are visible or potentially visible
to an enemy, so you need to get back into cover.
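[Editor's note: that stealth-game rumble boils down to mapping one game-state number, how exposed the player is, onto a motor strength. A toy sketch, not tied to any console API; the threshold value is made up for illustration.]

```python
def rumble_strength(exposure: float) -> float:
    """Map player visibility (0 = hidden, 1 = fully seen) to a
    vibration motor strength in [0, 1].
    Below the threshold the controller stays still, so cover feels calm;
    above it, strength ramps linearly up to full rumble."""
    threshold = 0.3  # invented value for illustration
    if exposure <= threshold:
        return 0.0
    return min(1.0, (exposure - threshold) / (1.0 - threshold))
```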
Yeah, yeah, vibration being, like, your spidey sense going off.
Yeah um. Or if you're if you're playing a horror
(21:40):
game and your your life starts getting too low and
you start feeling your heartbeat through the controller and that
lets you know that you need to drink a health
potion or, you know, a spell, whatever it is, right,
or unplug the game and go sit in the corner. Yeah. Uh,
if Pyramid Head is coming around the corner, you just
want to like, I just I just need to go
and read a happy book for a while. Uh. Or
(22:02):
you know. Another one that everyone is probably familiar with
is the vibrating motors in, uh, smartphones or
cell phones, right. And that's a very simple... Yeah, yeah.
Maybe they vibrate when you're doing an input, or
they vibrate to alert you that a call is coming in.
But that's, that's haptic feedback, a very basic application of
haptic feedback. I don't know if you've ever had the
(22:24):
experience of getting so used to this that it's weird
when it doesn't happen. I've had this experience with my
phone where I don't even notice anymore that when I
punch a key to like enter, you know, to enter
a letter on a text message or something like that,
the phone vibrates a little to register like, yeah, I
got that, I got that letter. Uh, and if
it's on very low battery and you go into energy
(22:45):
saver mode, it'll stop vibrating when you enter keys, and
it feels very weird. You keep wondering, like, wait, did
that press take? Did it take? In my
old phone? I actually turned that off because the the
it was reminding me of that episode of The I T.
Crowd where they soup up Roy's vibrating motor in his
cell phone, because every time I would type in a letter,
(23:07):
it made a really loud, like, 'eee' noise, to the
point where if I wanted to check something out while
my wife was napping, I would wake her up. So
I finally turned it off. My current phone is much more subtle,
so I'm all right with it. I've never had a phone
that did that. Although it did scare me
every time... I used to, I used to
have a Fitbit that I would wear constantly, and whenever
(23:28):
I went over ten thousand steps in the day, it
would do this, like, buzz buzz buzz. Yeah,
party time! And you're thinking, am I having a heart attack?
What's going on? Sometimes the acclimation period takes
a little longer for certain types of technology than others. Well,
one of the things I wanted to talk about, and
(23:48):
in fact, this was one of the stories that kind
of prompted this entire episode, was this research project.
Some people at the University of Sussex, with funding from
a couple of different companies and organizations, have developed something
they call SkinHaptics, and this is a device that
gives haptic feedback but does so without a moving, vibrating
(24:13):
type of motor having direct contact with your skin. It
actually does it through ultrasonic frequencies. Crazy, right? So I
did a How Stuff Works Now piece on this. How
Stuff Works Now, for those who do not know, that's
where we at How Stuff Works post stories that are
happening right now, the kind of newsy type of stuff,
(24:34):
and it tends to be science and technology focused, but
not necessarily. We've done some other things that are outside
of the realm of that, but I tend to focus
on science and technology. A lot of them are, since
it's research that's just coming out right now,
and so a lot of that has to do
with technology and science. Yeah. So this particular one, they're
(24:54):
using ultrasonic waves. You have a little emitter that
would go on the back of your hand. So you
would put this emitter down and it would be facing
down towards your palm, you know, through your hand, and
it would emit ultrasonic frequencies. They would move through your
hand through your very flesh and bones and concentrate on
points on the other side so that it would feel
(25:17):
as if something was making contact with the palm of
your hand, with, like, focal points on your palm. Okay, okay, sure,
so imagine that. Imagine that you have a screen projected
on your hand and it's a number pad, and when
you press, not only do you get the feeling of
where your finger touches your hand, you then get a
(25:37):
confirmation through this ultrasonic frequency saying, yeah, yeah, you totally,
you totally touched the 'one' button. Or you're playing,
let's say, you know, we're really looking into the
future here, you're playing Angry Birds on your hand
you pull your little your little bird back and you
let go and the bird collides with a pig, and
then your hand vibrates right at the point where
(25:58):
the bird and pig collided. That kind of stuff. Um,
that's the idea behind it. Now, this is just the
haptic feedback part, not the display. Your beautiful far future
definitely still includes Angry Birds. I... I can't imagine
a future without it, and I don't want to. Uh, but
I was thinking that this this same technology could be
used for stuff where you don't have a display element
(26:21):
at all. So imagine, if you will, a a bicycle
or a car that has sensors on it that can
detect potential obstacles, dangerous collisions, that kind of thing, and
you're using this device and it alerts you by
creating pressure on your hand through this emitter, and perhaps
(26:42):
is even telling you almost radar like where that threat
could be coming from, so that you have the opportunity
to react and avoid it. In other words, you have
spidey sense. You could even do this on a personal
level if you don't mind wearing a super dorky helmet
that's got sensors all over it. I mean, I can't really
think of any implementation where you would be able to
(27:04):
have sensors mounted in such a way that you didn't
look completely weird, as long as it's modeled after visually
modeled after the calendar that Rick Moranna swears and Ghostbusters, right, Uh,
you know the as the the was he was he
the key master, was he the gate keeper? He was
(27:24):
the key key master. Yeah, he's Vince Claro, key master
of Gozer. Right. Yeah, So, uh, if you for that
important research, if you want to tell him about the Twinkie.
So if you want the, if you want the Keymaster
look, and you don't mind it, and you want
to have spidey sense, then we could probably work something
up once the University of Sussex guys get this completely
(27:45):
ready to go, like ready for prime time. Yeah, but
it's a really cool idea, again, of using that touch
feedback, where the body itself becomes the method for
you to, uh, experience this technology, and you've stripped
away anything else, like you don't have a screen or
anything that you're dependent upon. You're just feeling this. Yeah. Yeah,
(28:07):
there's there's no way to to leave the device behind
because it's attached to you. Yeah. Now we could go
the next step, which is where our largest organ, the skin,
becomes an interactive surface, something that you use to interface
with technology like the touch screen on your phone. And
so today, if you want to read or send a
(28:28):
text message, you'll usually hold your phone in your hand
and read the text off the screen or type on
the screen by pressing letter buttons with your fingers. I've
actually slowly started to move to voice to text, but
that obviously only works in special situations where you know
(28:48):
it's really annoying when you keep doing that in meetings,
I don't. Yeah, I can't do that anymore in in
meetings or generally in public. But like if I'm on
my own, if I'm walking my dog and I need
to send a quick text to my wife, I often
will use voice-to-text, um, and then people just
think that I'm married to my dog,
(29:08):
because of the messages. Hey honey, do you
mind picking something up on your way home? And I
don't think that dog is gonna be at all cooperative.
It's great when you hear people saying I love you,
babe to a lamp post. Look, don't judge. Okay, you
don't know their love. I saw The Brave Little Toaster,
and that lamp was adorable, at any rate. Yeah. So, so,
(29:32):
so we've got a device in the way here. Before
we can get to Jonathan's uh perfect Angry Bird's future,
we need to take away that screen, that physical device.
So imagine that same experience sitting there with your phone
in your hand, type in a text message on it
or reading a text message off of it, but without
(29:52):
the phone there. And that's where you get this concept.
One implementation I saw of this was a thing. It
was from the Hasso Plattner Institute, called the Imaginary Phone,
which was a project by Sean Gustafson, Christian Holz, and
Patrick Baudisch. And 'Imaginary Phone' sounds great, on one
hand, because it's like an April Fools' prank. You give
(30:14):
somebody a gift: oh, it's your new iPhone! It's
an imaginary phone. Yeah. But so
the last I heard of this project was some media
coverage in two thousand eleven. I don't I don't know
if it's still in the works, but the idea at
least is interesting enough that it could continue to
(30:34):
be adapted with new hardware. And here's the basic way
it works. You wear a depth-sensitive camera on
your chest, um, and then you hold out your hand
in front of you, palm side up, and then you
interact with your own hand the same way you'd interact
with the phone screen. You know, you press parts of it,
and the camera tracks your movements and sends them to
(30:56):
your phone or other device as input commands. So different
places on your hand correspond to different inputs on
a phone screen. A swipe across your fingers will swipe
the screen. You can dial numbers by pressing different parts
of your palm, um. And that's interesting in terms of input,
but obviously if you're just looking at your hand, that
sounds like a kind of annoying thing to learn to do. Sure. Yeah,
(31:18):
I have a really hard time typing texts out when
I can see the letters on the screen, So I'm
picturing that that wouldn't go very well for me. Yeah,
and so, what about the... you basically need a corresponding display. Uh,
The appeal of the touch screen is that you you
can see what you're interacting with and perform your interactions
in the same place. And that that brings us to
(31:39):
this idea of projected displays on skin, which inherently,
there's no real problem with that, except that it just
requires some good, noninvasive, comfortable design and some good engineering.
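[Editor's note: for the input half of an Imaginary Phone-style system, the core step is quantizing the touch point the depth camera reports into whichever on-palm 'button' it falls in. A rough sketch assuming a phone-keypad grid and normalized coordinates; the project's actual code and layout aren't published here, so these details are assumptions.]

```python
# Assumed layout: a standard 3x4 phone keypad mapped onto the palm.
KEYPAD = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"],
          ["*", "0", "#"]]

def palm_to_key(x: float, y: float) -> str:
    """Quantize a touch point on the palm into a keypad cell.
    x, y are normalized to [0, 1), origin at the top-left of the
    palm region; a depth camera would supply them in practice."""
    col = min(int(x * 3), 2)  # three columns
    row = min(int(y * 4), 3)  # four rows
    return KEYPAD[row][col]
```

So a touch near the top-left of the palm reads as a '1', and a touch near the bottom-right as a '#', no screen required.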
I guess we've got more to say about using your
body as a computer interface after this. So one project
(32:09):
I came across that was trying to do something like
this was the Cicret Bracelet project. Cicret: C, I, C,
R, E, T. Maybe you're supposed to keep it secret,
keep it safe. Anyway, there was some coverage of this
project a while back, and the idea is it's a bracelet. It
(32:31):
looks kind of like a Jawbone UP, you know. It's
a bracelet that has a low-angle projector on it
that projects a display onto the surface of your forearm.
So you wear it on your wrist and it projects
at a low angle up onto your skin on your arm.
And if you go watch the the original video promo
for the project, you will see animations that are, I
(32:52):
mean, obviously, they are animations, because it was before the thing was made, just trying to say here's-the-concept type of thing. Probably just a concept, here's the concept. And you could see that people were mad that it was making it look like these displays would be incredibly sharp and beautiful looking. Um, and that's probably
(33:12):
not going to be the case, especially early on, and
they might have been optimistic. But now there are prototypes.
I've seen videos of them being used, and the image quality in the real world might not be amazing,
but also it's not too bad. And that also might
not be that big of a deal, because what if
you're not really planning on watching movies on your arm,
(33:33):
but what if you just wanted to be able to
use your arm to send a text message or something
without having to mess with a device, get a device
out of your pocket and deal with it. Um. Yeah,
it's an interesting idea. It's one of those where a lot of the ones we're talking about now, you still have to pair it with your phone, right. So then the question
is is the benefit of whatever this body interface is
(33:57):
that benefit greater than the frustration you might have of
just taking your phone out and doing these things. One
of the things I was first thinking about when you're
talking about typing stuff on your palm with the earlier
implementation was well, what if you wanted to make a
phone call, you would still have to get your phone out.
And then I thought, wait, no, you're making a rookie mistake, Jonathan.
(34:17):
No one uses their phones to make phone calls anymore.
So I don't even know why that crossed my mind,
because I'm a dinosaur, is really what I'm getting at.
But anyway, I haven't had a good Jonathan age joke in a really long time. Well, we haven't. We haven't ever had a good Jonathan age joke. I think Jonathan made some get-off-my-lawn jokes just last week. Okay,
(34:38):
to be fair. I also talked about acid rain in the episode we recorded immediately before this, and talked about growing up in the eighties and being afraid of communists, which, by the way, was a product of the time. Um. So,
one of the other things I want to talk about
about bodies and interfaces doesn't have to do with displays.
We talked about haptic feedback, we talked about visual feedback.
(34:59):
What about audible feedback? Using your body to make sounds
and not in like a middle school kind of way
where you're you're showing your friends how hilarious you are
by utilizing your body to make various fart noises. That's
not what I'm talking about, sure, and I mean, we're making sounds with our bodies right this very minute. But what if we could, it's a
(35:20):
relatively easy system, what if we could make that whole process more convoluted and creepy? Like, instead of just talking to you like a human being, you and I would still have to be in the same space for this to work, but then we make it super creepy. And by creepy, I mean creepy. Well,
(35:41):
I'll explain what I'm talking about, and you tell me
if I'm off base calling this creepy. The system I'm
talking about is called Insian denshion um and this was
a a concept out of Disney research actually, and it
is all about turning your body into a transmitter um
and it's pretty well key. So think of one person
(36:02):
being the transmitter. This would be someone who's like the speaker,
as in the person speaking, not a speaker in
the electronics sense, and the other, yeah, the other person
is the receiver or the listener, and together you end
up creating a speaker in the sense of electronics. So
(36:23):
the transmitter person speaks into a microphone, and presumably they're
saying something quiet enough so that the other person isn't
hearing it, so they might be whispering something like a
like a I want to go to lunch today, and
you whisper it into your microphone. The microphone transmits that sound wave. It actually transforms it into a high frequency,
(36:46):
low power electrical signal, which your body can carry, because our bodies actually can conduct electricity. So then, yeah, yeah, too well in some cases, which is, you know, one of the many reasons you've got to be real careful around electricity. But let's say, all right, you've got this microphone, you've just whispered something
(37:08):
into it. It's now transmitting a low power, high frequency
electrical signal through your body. You then walk up to
the person who is the receiver, and then you just
casually reach out and put your finger on that person's ear.
This is the part where I think it's kind of creepy.
And then the connection that you make with that other person, that connection, that's what allows their ear to suddenly become
(37:32):
like a speaker, and the person whose ear you're touching
will be able to hear the thing you've whispered into
the microphone. Uh So, it's also interesting in that you
can extend this by having multiple people involved. It's almost like playing a game of telephone, in that you could whisper
something into the microphone, put your hand on someone's shoulder,
(37:52):
they put their hand on the next person's shoulder, and
the next person, and then that person puts their finger
to the recipient's ear. They would hear what you had
originally whispered into the microphone. I don't know that there
is any practical application of this technology. It's just another
weird way of turning the body into an actual interface.
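The signal chain just described, audio turned into a high-frequency, low-power signal the body carries, is essentially amplitude modulation. A rough sketch follows; the actual Disney Research system modulates an electrostatic field through the body, and the sample rate and carrier frequency below are illustrative assumptions, not the real system's values:

```python
import math

# Sketch: amplitude-modulating a whispered audio signal onto a
# high-frequency carrier, as in the body-as-transmitter idea above.
# SAMPLE_RATE is deliberately oversampled so the carrier is
# representable; both constants are assumptions for illustration.

SAMPLE_RATE = 2_000_000   # samples per second
CARRIER_HZ = 300_000      # assumed inaudible carrier frequency

def am_modulate(audio, depth=0.5):
    """Ride the audio (samples in [-1, 1]) on the carrier's amplitude."""
    out = []
    for n, sample in enumerate(audio):
        carrier = math.sin(2 * math.pi * CARRIER_HZ * n / SAMPLE_RATE)
        # The audible signal becomes the carrier's amplitude envelope.
        out.append((1.0 + depth * sample) * carrier)
    return out

# One millisecond of a 440 Hz tone standing in for the whisper:
tone = [math.sin(2 * math.pi * 440 * n / SAMPLE_RATE) for n in range(2000)]
modulated = am_modulate(tone)
print(len(modulated))  # -> 2000
```

On the receiving end, recovering the audio is envelope detection; in the Ishin-Den-Shin demo, the touched earlobe itself effectively acts as the speaker diaphragm.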
You know, I do try to keep an open mind,
(38:13):
but but burn all this with fire. I don't really
like it when people touch my ears. Yeah, I like, I like listening to ASMR videos in which there is an ear massage element going on. But the actual thought of someone physically coming up and touching my ears squicks me out a little. I feel like that's
(38:33):
a violation of personal space that I'm not ready to
deal with casually. That's something that should be left up to, you know, two consenting adults. Alright, exactly. I mean, I could see it being used in, uh, in some other way of transmitting information. Um, well,
(38:54):
it could be an interesting way of being able to
transmit information quietly, but you still have to speak into the microphone the first time. Well, I mean, if
it wasn't using sound waves, if, if you could hook yourself up to some kind of device that would read that, not literally out loud in your ear, but that would read the information and be able to, I
(39:14):
don't know. Uh, yeah, I don't know. You couldn't, because you couldn't do it to yourself, because you wouldn't be creating a full connection. You have to be touching somebody else.
So I mean, I mean, something like, oh, here's my business card handshake. I could see some game elements in there too. But again, it's a little
(39:38):
wacky and little weird, but it was an interesting way
of looking at turning the body into an interface if
you wanted to go the next step, Like let's say
we've gotten past the wearable stage. I think a lot
of people are imagining the body as interface with embedded technology,
stuff that would be incorporated into us if you're really
(40:00):
desirous of surgery. Yeah. Uh, and according to Repo! The Genetic Opera, there is totally a group that's into that. Um, but yeah, this would be where you would actually have some sort of technology embedded in you. And obviously,
at this point we're talking pure speculation. There's not examples
of this beyond stuff that bio hackers are doing, and
(40:22):
even in that case, they tend to be incredibly primitive
applications of technology. Uh. And it may be that we
won't see this kind of stuff for quite some time.
For lots of reasons. Now, we do have examples of
embeddable technology biotechnology generally speaking, though they are designed to
address a problem. Like, let's say that someone has a
(40:45):
visual impairment that they want to overcome; there might be technology they use in order to supplement their eyesight, or their hearing with, like, cochlear implants, that kind of thing. Now, we've got examples of that, but we don't really have examples of attempts to enhance already quote unquote normal, or within-the-norm, kind
(41:09):
of human capabilities. And I hate using that phrase, but that's the way it tends to be framed. I mean, well, and medically speaking, there is an average or normal, and it's not meant to... But absolutely, yeah. And of course, the problem there is that any responsible medical person isn't going to recommend, I mean, surgery is
(41:30):
always dangerous. Getting an implant of any kind is always going to carry an element of risk. You could get an infection, it can go terribly wrong, your body could reject whatever it was that was implanted. Yeah, yeah, it could trip your immune system for that reason, lots of things like that. So therefore, and we talked a
little bit about this in our Hacking Your Body episode
back when. Yeah, and there are other things obviously that
(41:51):
you have to keep in mind, things like the technology's
battery life. How do you recharge a battery? What is it, is it drawing power from the person in some way? How can you create something where you make sure that it'll work within the body and not break down or otherwise end up falling apart within a certain amount
(42:13):
of time? Because the body, I don't know if you know this, is not the most hospitable environment for technology. Yeah, yeah,
it's, there's that rust thing that we were talking about, oxidizing. Yeah, yeah, that's an issue, right. Yeah,
you know, bad guacamole in the bloodstream. That's not something
we want to mess around with, right, And there are
there are lots of researchers who are working on ways
(42:35):
to get that to be better. Yes, you know, specifically for things like heart monitors and whatever. But it may be quite
some time even if we get to a point where
the technology is reliable, where it's safe, where where the
the potential for complications is as low as we can
(42:56):
possibly make it, there's still going to be a barrier there, where the medical profession in general may see an ethical issue of, do I do this? It would essentially be akin to cosmetic surgery, but possibly with far greater, uh, ethical concerns than your
(43:17):
average cosmetic surgery. Is it ethical for me to perform this?
Am I going to risk my my livelihood if I
were to do this? And then you might eventually get
to a point where socially it's more accepted, but there's
probably gonna be a lag between when people are actively
(43:38):
advocating to get this done to themselves and when it
becomes socially acceptable in general. And and that period is
going to be interesting to watch and find out, like
how it'll almost help determine how long it takes to adopt that as, uh, a perfectly standard kind of practice. Um,
(44:01):
then there comes the question of haves versus have-nots. There are other conversations that happen further down that line
which fall more into that singularity conversation we've had multiple
times in this show. Now, the cool thing I think
is that you could argue a lot of the technologies
we're seeing right now are negating the need for surgery
in the first place, and it's largely through things like
(44:23):
machine learning, artificial intelligence, predictive technologies, very simple sensors working
on very complicated algorithms to respond to our needs in
a way where we we become unaware that our environments
are adjusting to us without our direct command. So a
(44:45):
very simple example of this would be something like a
Nest thermostat or some other smart thermostat, where it starts
to learn quote unquote what you like, what your preferences
are that maybe you like it pretty chilly at night,
but you like to wake up to a nice, warm,
toasty house, or it's recognizing when you are home versus when you are not home, and thus adjusting the
(45:08):
temperature so that you're conserving energy whenever you're not in the house, and you're not just wasting electricity. Although that is still through the pairing of a device. Yeah, it's pairing, well, it's pairing a device through a WiFi network. You don't necessarily have to have it paired to, uh, like a smartphone or anything.
(45:28):
But the interesting thing is it is detecting when you
are there. It's detecting what you want, and it's responding
without you having to make a direct command, although there's an acclimation period at the beginning where you are making those commands. Otherwise it doesn't know. It's not like the thermostat takes a look and goes, uh, this bald guy,
(45:49):
he's gonna want it at seventy degrees, I'm just gonna go ahead. No, I gotta tell it that I want it at seventy
degrees first. So I think what you're saying is we're
all going to get thermostats implanted in our bodies so
we can be seventy degrees on the inside. That is
not what I was suggesting, But it always is interesting
to get an insight into your thought process. Show Um. Yeah,
(46:10):
so but my point you think how much energy you'd
say if you didn't have to make the whole house
seventy degree just your own body. And my point being
that we are seeing some technologies come into play where
our bodies are in a way becoming an interface. But
it's through it's not through a conscious effort for us
to control that technology. The technology is responding to us.
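The learn-from-your-commands behavior described with the Nest can be sketched as a tiny preference model: manual commands at first, then a learned per-hour schedule plus an occupancy setback. Everything here, the constants, the class, the update rule, is an illustrative assumption, not how any real thermostat is implemented:

```python
# Sketch of the "learning thermostat" behavior described above: it starts
# from manual commands, learns a per-hour preference, and sets back the
# temperature when nobody is home. All constants are illustrative.

AWAY_SETBACK_F = 62.0   # energy-saving temperature when the house is empty
LEARNING_RATE = 0.3     # how quickly a manual override shifts the schedule

class LearningThermostat:
    def __init__(self, default_f: float = 70.0):
        # One learned setpoint per hour of the day.
        self.schedule = [default_f] * 24

    def manual_set(self, hour: int, temp_f: float) -> None:
        """A direct command -- the only way it learns at first."""
        old = self.schedule[hour]
        self.schedule[hour] = old + LEARNING_RATE * (temp_f - old)

    def target(self, hour: int, occupied: bool) -> float:
        """What the thermostat aims for without being asked."""
        return self.schedule[hour] if occupied else AWAY_SETBACK_F

t = LearningThermostat()
t.manual_set(7, 74.0)   # you keep nudging it warmer at 7 a.m. ...
t.manual_set(7, 74.0)
print(round(t.target(7, occupied=True), 1))   # -> 72.0 (drifting toward 74)
print(t.target(7, occupied=False))            # -> 62.0 (nobody home)
```

The point of the exponential-moving-average update is exactly what's described in the conversation: the device needs explicit commands at the start, and only gradually does the schedule become something it applies without being told.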
(46:32):
And if we see that increase in other ways, it may turn out that the more invasive approach becomes moot.
We don't need it because we're able to compensate with
other technologies that are able to do these things through
machine learning. We got a bit more show to go.
But uh, before we get to that, let's take one
(46:54):
last break. Well, when, yeah, when you speak of the ways technology is adapting and learning from us without our knowledge, I mean, I wonder about, you know,
(47:14):
is Facebook going to get to the point where it
uses the camera on my computer to look at my
face and see what disgusts me and show me more
of that because I'm more likely to click on it. Yeah,
I mean, there are certainly privacy concerns with this particular approach, and we've already, whether
(47:34):
or not it's a privacy concern, let's say I authorize them to do it. I mean, either way, they're responding to
your unconscious cues that you give with your face and
your eyes and everything like that. Yeah, I mean, that's
that's something that I'm sure there are people looking into. I mean, but the next thing we're going to talk about here are micro expressions, micro expressions being
(47:55):
these very tiny gestures you can make sometimes unconsciously, and
how people are hoping to turn those micro expressions into
a way to interface with technology, largely because I think
most of us don't want to have to make big
gestures on our bodies in order to control technology. Would
be better to do very subtle things. Yeah, there has
(48:17):
to be some kind of happy medium between having to make flag semaphore motions at your Kinect, and, uh, having a brain implant that's directly reading, please now chicken dance to answer this phone call. Right, right. Yeah. There's so
many great comedy sketches that could be a result of
this conversation. But there was a piece that was written
(48:39):
in Fast Co. Design. Uh, Andy Goodman and Marco Righetto wrote it; they're at that design company Fjord. Yes, yeah, yeah. And, uh, we'll come back to that towards the end of this, too. But they were arguing that the motions we make with body interfaces should be minimalistic,
(49:00):
and that there's already precedent for this when you are
moving a mouse. This is one of those problems that
I had when I was learning how to use a
mouse with my right hand. Uh, the small motions you make are translated into larger motions on screen. Right,
So when you when you move the cursor on your screen,
you don't have to move the mouse the same distance
(49:21):
on your desk as what you're seeing on your screen.
At least you shouldn't. Right. If you are, there's a problem,
you need to change some settings. But generally speaking, like
you might move your mouse over an inch, but you are moving the equivalent of, like, five inches on the screen.
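That mouse precedent is usually called control-display gain: physical motion is scaled before it reaches the cursor. A minimal sketch, with a gain value that's illustrative rather than any operating system's actual setting:

```python
# Sketch: control-display (CD) gain -- small physical mouse motions are
# scaled into larger on-screen cursor motions, the precedent mentioned
# above. The constant is illustrative, not any OS's actual default.

CD_GAIN = 5.0  # one inch of mouse travel moves the cursor five inches

def cursor_delta(mouse_dx_in: float, mouse_dy_in: float) -> tuple:
    """Scale physical mouse movement (inches) into on-screen movement."""
    return (mouse_dx_in * CD_GAIN, mouse_dy_in * CD_GAIN)

print(cursor_delta(1.0, 0.0))  # -> (5.0, 0.0)
```

Real pointer drivers usually vary the gain with speed (pointer acceleration), but the fixed gain above captures the inch-becomes-five-inches mapping from the conversation.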
I mean, do you remember when the Nintendo Wii came out, and how long it took people, not
(49:43):
that long, to figure out that all these games that
were supposedly about like doing big motions and getting exercise,
you could actually just sit right there on the couch
and kind of flick the controller to accomplish the same thing. Yeah,
there was a specific flick of the wrist. There was an arcade game, it was a boxing game, where the controllers were like two big boxing gloves, and
(50:04):
it could detect, it had motion sensors, cameras essentially, mounted on the top of a frame looking down, so it could tell when you were ducking or moving left or right. And then the controllers actually had, uh, accelerometers in them to tell when you were punching. But it turned out that if you just stood there and just banged the gloves together, it counted as a punch,
so you wouldn't get tired very quickly, and you could
(50:27):
just yeah, you could wipe out like the first five
or six guys on that on that yeah, just doing that.
I did not learn that until after I had actually
hurt my back playing an arcade game, and that's the
first time I ever felt old. Um. Alright, so anecdote. Yeah,
(50:47):
I mean, I've gotten to the point where I just
have to I have to own it, right, I just
got to own it. So the writers of this piece thought that those minimalistic kinds of movements that you would see with a mouse should be the same sort of things you would see with a body interface, so that you could do very subtle things to control your devices, as opposed to doing things like swiping up
(51:09):
and down your forearm in order to, say, adjust the volume of music being played on your device, or make the lights in your home dim or get brighter. I think, God, one of the examples they did is about, like, putting your, your tongue on one of your teeth. I have a quote, so don't jump ahead. Yes, I apologize. So specifically, in
(51:32):
this section, the description can get a little creepy.
So this is this is a paragraph that Joe was
just referring to in this piece. It says, Uh, think
about this scenario. You see someone at a party you like.
His social profile is immediately projected onto your retina. Great, a match. By staring at him for two seconds, you
(51:53):
trigger a pairing protocol. He knows you want to pair
because you are now glowing slightly red in his retina screen.
Then you slide your tongue over to your left incisor
and press gently. This makes his left incisor tingle slightly.
He responds by touching it. The pairing protocol is completed.
(52:15):
That is horrifying. Yeah, the next piece I wanted to talk about, this is describing a party hookup in terms of, like, the stuff you would do to, I don't know, initiate trading in a massive... I think what they were looking at was, they were like, you know, Tinder is great, but not nearly creepy enough, or in your body. Yeah, we
(52:38):
don't touch our teeth at all during Tinder, right, and we need to, we need to really get some tongue-to-tooth action. Yeah, because otherwise how will he know that you're interested in him? So no,
I'm confused about the part where it says his incisor starts to tingle, he responds by touching it. Does that mean with his tongue, or with another part of
(52:59):
him, or with someone else's tongue? Who's to say? The
world is full of amazing possibilities. It's funny that you read, I read this exact same piece, and I did not decide to include any of it, because I found it off-putting. Well, the reason specifically I decided to include it was because, to kind of conclude
this conversation, there are people who are saying, do we
(53:22):
even want our bodies to be a technological interface in
the first place? Right? There's a really good piece written more or less in response to that piece. Yes, in Technology Review, right. That was by John Pavlus, whose piece was titled Your Body Does Not Want to Be an Interface, which pretty much tells
(53:42):
you what the argument is going to be. It's a
very well written piece, and it's it's very interesting. Um.
And he first argues that turning bodily experiences or motions
into a command issued to technology would make it feel
unnatural and alien, which is the opposite of the intention. Right.
The intent is to remove that barrier between you and
(54:05):
technology so that a natural motion gets interpreted as a command and the technology responds. But he says, if you're doing this quote unquote natural motion in order to issue a command, you're not really being natural, by definition, because you are issuing a command to technology, something
that is not a natural thing for us. It might
(54:25):
become something that ends up being second nature after doing
it enough times, but in itself it becomes this alien
task because now you're you're trying to do something in
order to make a command. I can easily understand what
he's saying, because if you've ever worked with a voice
command system and you have to issue commands in a
(54:46):
specific way in order to get results, it feels very
unnatural because you're having to preface what you say with
some sort of command, or you have to word it
in a specific way for it to understand what you're saying.
It is not a natural thing. Um. And so he
says that we would make ourselves kind of hyper aware
of how weird it is to like run our fingers
(55:07):
along the inside of our forearms to adjust the dimness
of the lights in our house or whatever. And Pavlus refers to a computer scientist named Paul Dourish, or Dow-rish, who in turn took inspiration from the philosopher Martin Heidegger. Heidegger, Heidegger was a boozy beggar, according to Monty Python. And, uh, it was all about differentiating two general types
(55:29):
of tools. The first type is called the ready-to-hand technologies. Those are things that feel like they're an extension of our bodies. Uh. And so think of
a hammer when you're hammering a nail, he says, that
would be a very very brute version, or in my
case for this weekend, the rapier. I've been working with
(55:50):
one for a while, so it feels like an extension of my arm. When I first picked it up, that was not the case. It would fall into the second
category of tools, which, uh, you know, is the present-at-hand type. So the ready-to-hand type, that's the kind where, as you're using it, you're not
even really thinking of the tool as a separate thing
from you. It's an extension of you, right, but the
(56:13):
present-at-hand, you are aware of the presence of
that tool. So, for example, when I was learning to
use a mouse, I was hyper aware of the mouse
because it was so hard for me to learn how
to use. It was something that I was absolutely conscious of.
It did not feel like an extension. These days, it
totally does because I've used it enough where I've reached
that level of familiarity, so things can change. Yeah. Yeah.
(56:36):
Anyone who's learned how to play a musical instrument, for example,
has probably gone through this. Yeah. That's a great, a
great way of putting it. So any of you out
there who think back to when you first started learning
how to play any kind of musical instrument, think about
your first day, when someone first said, like, and this is a chord, and you went, like, nope. Yeah yeah.
Or you're like, especially with stringed instruments, where, like, your
(56:57):
fingers start to hit other strings, so you're muffling some of the chord, and you know that doesn't sound right. What is going on here? And you feel in that
first stage like you're never gonna get it right. It's
never gonna happen like you have to concentrate so hard.
But then eventually you start to develop a familiarity and
it becomes that first type of tool where it just
(57:19):
feels like an extension of yourself. Um. He says the problem is, if we turned our bodies into interfaces, at least for a while they would turn into that second type of tool; our bodies would feel weird and alien to us. He specifically took the example of John Cusack's character in Being John Malkovich trying to control
(57:41):
John Malkovich. That's like a marionettist dealing with a puppet,
and and it's not you're you're not skilled yet, you
don't know how to manipulate it properly. But instead of
it being another person's body, it's your own body. And
he says, that sounds like an awful experience. I don't
want that. Yeah. Yeah, it enters into this, and I
(58:01):
guess I was sort of emotionally reacting in this way
when I was thinking about some of those examples, the
tooth touching and whatever. Uh. It enters into this kind
of Cronenbergian sort of body horror area where
you yourself are a foreign object. Yeah, and how do
how do you deal with that? Well, you reconcile
(58:23):
the two, and you figure that these interfaces are ultimately going to be designed by somebody who thought, this has got to be the best way, this is the way it makes sense to do this thing, right? But that's not you.
that seems natural to the person who designed the system
is completely unnatural to you. And then you have to
(58:43):
commit this unnatural motion in order to do something you
want to do. That's not a good experience. And yeah,
I I think there could totally be a Cronenberg style
body horror film based upon this premise. You know, even if you are doing something that isn't, on the surface, horrifying, if you make it clear that it
(59:06):
is something that doesn't feel right in order for you
to get the response you want, that is a very
disturbing idea. Yeah, I wonder if this is the kind
of thing. There's been some research lately into common facial expressions that people across the world all make. There was one story that came out that I think the aforementioned HowStuffWorks Now, uh, covered, about a
(59:28):
nope face that apparently is just common to many human populations. This, like, I'm disinterested and I don't want to be interested kind of facial expression.
And so I wonder whether further research into that kind of thing could possibly identify common micro gestures
(59:49):
that are just ubiquitous, that would be natural for people to use, and that technology could pick up on without us even really having to be aware that we're making them. And of course, the challenge there also is, how do you determine which ones need to be conscious decisions on the part of the person
interacting with technology. Because if it's an active thing, you
(01:00:12):
don't want to accidentally activate your technology, if all you need to do is, like, scratch your nose, or whatever tiny thing it might be, or that you're blinking at a certain frequency, whatever it might be. You don't want to be activating your technology accidentally. Like, I ended up taking a whole bunch of pictures of myself where I was looking like a complete
(01:00:34):
doofus, because it just so happened that the gesture that I was going through at that time was the same as my command, hey, take a picture now. So
there are some challenges. Yeah, I think I was thinking, like, so far ahead, like into the, like, Her kind of universe. I saw that film. The idea where you actually have this world that can anticipate things, like
(01:00:54):
the Nest thermostat model but on steroids. You know, this idea, we talked about this in our Internet of Things episode, where we said, you know, you extend this idea outward far enough, it's an environment that anticipates everything you need before you even are able
to consciously think of what those needs might be. That's
(01:01:15):
that's kind of the end destination that people really hope
to get to. That was your body as a computer
interface from the Forward Thinking podcast. You know, we don't
do that one anymore. I really miss it because there
is nothing like sitting down with some of the smartest
people you know and just talking about cool potential stuff,
(01:01:40):
like things that are on the cusp or just emerging
uh, or perhaps are only hypotheticals, and kind of casting your mind forward and imagining what may be. Obviously, predicting the future is always a tricky thing. You never really
want to double down on that because you could be
wildly wrong. But it was fun to do. If you
(01:02:04):
have never listened to the Forward Thinking podcast, I suggest you check it out. Just go through, look at the episodes, see if there are any that strike your fancy. Um,
because I feel like we did a lot of really good work. Now, granted, those episodes are several years old now. In some of those we were predicting things that either definitively came to pass or did not, um, and
(01:02:26):
if they did come to pass, they probably looked very different from the way we imagined them. But yeah, I
think it's a really good show that you know, you
can check out. There's also a Forward Thinking video series
that I did several years ago that's all up on
YouTube if you want to check those out. Um. I
occasionally get notifications about people still watching those episodes, and
I haven't done one in years, but it's very gratifying
(01:02:50):
to know that people still watch it occasionally. Anyway, if
you have any suggestions for future episodes, or you just
want to say hi, anything like that, a couple of
different ways you can do that. You can go to the I Heart Radio app and navigate to the TechStuff page, and there's a little microphone icon. You click on that and you can leave a voicemail message up to thirty seconds in length. Or, of course, you can
(01:03:11):
reach out via Twitter. The handle for the show is TechStuff HSW, and I'll talk to you
again, and hopefully sound a billion times better, really soon. Yeah. TechStuff is an I Heart Radio production. For more podcasts from I Heart Radio, visit the I Heart Radio app,
(01:03:34):
Apple Podcasts, or wherever you listen to your favorite shows.