Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Lake Nona, a beautiful residential and commercial oasis
where the future has arrived. Lake Nona is a seventeen
square mile community in Orlando, Florida that has established new
standards of living that integrate the latest technology into every
facet of life, including, but not limited to the way
(00:24):
its citizens get around. Picture this. A person stands in
the warm Florida sun at a designated bus stop, waiting
for the next shuttle to arrive. And here it comes,
not with the roar of an engine, but with the
gentle hum of an energy efficient electric motor. The bus
glides to a halt, and as the doors open, something
(00:45):
is missing. There's no one in the driver's seat. That's
because Lake Nona is home to one of the country's
largest and longest running single site autonomous vehicle fleets. These
energy efficient, self driving buses have transformed the way residents
travel in this community, making it safe and easily accessible. They whisk
people from place to place, freeing hands, reducing traffic congestion,
(01:09):
and embracing a sustainable future. What else can a world
of autonomous public transportation do? How else might it impact
the way a community operates in this bright and sunny corner
of the world? The horizon is limitless and our journey
is full of possibilities. Hey there, I'm Graham Class and
(01:36):
this is Technically Speaking, an Intel podcast. The show is
dedicated to highlighting how technology is revolutionizing the way we live,
work and move. In every episode, we'll connect with innovators
in areas like artificial intelligence to better understand the human
centered technology they've developed. Thus far, we've explored how AI
impacts society in the ways of agriculture, accessibility, and mental health.
(02:01):
But one of the ways technology and especially artificial intelligence
impact society is through its structures. AI is advancing the
ways cities are able to serve their citizens. There's a
very interesting example of this happening in a small town
in the United States. But before we go any further,
I need to introduce my guests. Joining me now is
(02:21):
Joe Moye, the CEO of Beep, which is a company
that offers autonomous mobility solutions in public and private communities
across the US. His career has spanned the technology arena,
from hardware and software to IT services. He has spearheaded
groundbreaking enterprise projects from cutting edge startups to multi billion
dollar enterprises. Joe's expertise in innovation, strategy and transformative technologies
(02:46):
paved the way for his role at BEEP, where he
now leads a new team transforming mobility as we know it.
We are so excited to have you on, Joe.
Speaker 2 (02:55):
Thank you, Graham, glad to be here.
Speaker 1 (02:57):
Also joining us is Juan Santos, the senior vice president
of Brand, Experience and Innovation at Tavistock Group. At Tavistock,
he's part of a multi disciplinary team that uses design
thinking to build places where people can thrive. Juan is
a recognized expert in design thinking, user generated content, virtual
worlds, physical and digital, and loyalty and rewards. Welcome to
(03:20):
the show, Juan.
Speaker 3 (03:20):
Thank you very much, Graham.
Speaker 1 (03:25):
I'll start with you, Joe. Can you just tell us
a little bit more about Beep and in particular your
personal story around why you decided to get involved with
the company.
Speaker 2 (03:36):
Yeah, I'm happy to, Graham, and thanks again for having us.
So Beep was founded on the premise that autonomous mobility
is going to be proven out in, I'll say, incremental
use cases. I know everybody has had different experiences and
or has read a little bit about what driving and
mobility is about. You know, I would tell you, if
(03:57):
you think of the technologies and the work that we're
doing, it's very focused on shorthaul first mile last
mile type use cases in public and private communities, solving
for that micro transit gap across many areas of our country.
Second, it's very important that it's a shared platform, so
(04:17):
we focus on more controlled speed, geofenced use cases,
but in a shared mobility form factor, meaning a shuttle
that seats ten to twelve passengers and really represents
that ability to provide a good balance of yes, personal mobility,
but also community mobility. So the business was founded by
(04:39):
a group of us that are also investors in the company.
We've been entrepreneurs across a couple of funds, so we're
venture capitalists as well as operators. And again, as we
looked at this key inflection point in the area of
technology specific to autonomy, we took a very calculated approach to
focusing on this micro segment of the larger market of
(05:02):
autonomy around this electric shared autonomous mobility in these micro
transit use cases.
Speaker 1 (05:12):
BEEP is a turnkey mobility solution with the goal of
providing stress free transportation, reducing carbon emissions, and improving road safety,
offering autonomous transportation to thousands of people. Beep's technology focuses
on community and offers localized travel solutions that reflect the
way people want to engage with their neighborhood. Are these
(05:34):
vehicles going to be driverless or driver assisted? How
is that currently being played out?
Speaker 2 (05:40):
Yeah, it's a great question. We work in partnership with
the US Department of Transportation, who oversees the use of
these vehicles on our roadways today. So the vehicles are
operating fully autonomously a very high percentage of the time, but we
do have safety attendants or ambassadors on board whose responsibility
(06:01):
is to both educate and welcome passengers, introduce them to
the technology, and help them feel comfortable with these types of services,
but also to take over manual control should that be
needed if there's an event on the roadway that requires
some level of intervention. Fast forward a couple of short
years and those attendants are going to be virtual or remote.
(06:26):
So we will in our types of services always have
a human in the loop. It will shift from being
an onboard attendant to a virtual attendant. And you can
only imagine, especially in the area of public transportation, if
there is some circumstance, be that a traffic jam or
a pothole on a roadway or some other eventuality. You
(06:50):
still have to be able to communicate with passengers on
board if there's a reason to pull a vehicle off
to the side of the road, and let people know what's going
on and what to do about it.
Speaker 1 (07:00):
Okay, great. I'll bring Juan into that discussion now. Could
you just tell us a little bit about your work
at Tavistock Group?
Speaker 3 (07:08):
So I lead innovation and brand experience in what
most people would traditionally think of as a development company. However,
Tavistock Development, which is the area that I focus mostly in,
is not your traditional developer. We are actually an owner operator,
and in the case of Beep, we have a place
called Lake Nona, which is directly contiguous to the Orlando Airport. We're
(07:32):
proud citizens of the city of Orlando, but we represent
an advanced district in the city, and it's a fairly
large advanced district. We're approximately seventeen square miles; to give
you a point of comparison, Manhattan's twenty two. So it's
a fairly large swath of land. And then we have
pretty much every use case inside Lake Nona. I mean,
we have universities, high schools, people can go to preschool,
(07:56):
there's micro apartments, there's large homes. So it becomes this
really interesting place for people to live, but also for
companies that are on the forefront of technology to use
us as a living lab. The reason Beep is a critical
partner for Lake Nona is because we believe mobility is one
of those things that creates a lot of friction inside
(08:18):
a community. Right? You come to a place and parking
is difficult, moving from one place to the other is hard. Those
are kind of the not so enjoyable, not so
great parts of being in communities that are successful. In Lake Nona,
we've tackled that friction with mobility in a variety of ways,
but we've also incorporated Beep and their autonomous shuttle operation as
(08:40):
a critical part to provide that first and last mile,
mile and a half inside the community for people to traverse.
And it's something that has been running now for multiple years.
We have what I believe today is the largest and
longest running autonomous shuttle operation in the United States in Lake Nona.
It's actually so prevalent now; we're coming close
(09:03):
to the end of the year, and last Halloween a kid
actually dressed up as one of
the autonomous shuttles. So it's something that's both an incredible
service that relieves friction, and it's become a natural part
of the ecosystem that people live with and live in
in Lake Nona.
Speaker 1 (09:21):
Yeah, I'm interested in how that autonomous shuttle bus started
and was there any I guess pushback or were any
challenges with the community to try and get this sort
of thing deployed.
Speaker 3 (09:35):
Actually, it was incredibly well received. It started in a
conversation with the founders of Beep. We were actually having
a conversation about a different topic and the topic of
autonomous mobility came up. And after that conversation, fast forward
eleven months and the company had been created, the vehicles
had been brought into the US. We worked with the Department
(09:56):
of Transportation and NHTSA to make it happen, and from
a community perspective, we actually did an outreach process where
we actually allowed critical members of the community to be
a part of understanding what the vehicles would do. For example,
we had a specific day where the Beeps were on
preview just for first responders, so we showed our police
(10:18):
department and the fire department how to work with the vehicles,
how to operate them, how to move them if necessary,
and when the vehicles rolled for the first time, we
had a community that was ready, so we didn't have
much pushback. Now, people had to adapt to
having a vehicle with no driver, right, because even though
there's a safety attendant on board, the vehicle is operating on
(10:41):
its own, and it operates differently than a humanly controlled vehicle.
So we had some situations where people were like learning
to interact with them. But for the most part, it
was very well received. One of the hallmarks of Lake Nona
as a community is that our citizens think of
themselves almost like citizen scientists. They're almost asking us
(11:03):
what's new every week. It's like what's the new thing
to try. They've come to expect strange things to happen,
you know, on the roads and other places in Lake Nona.
So I think it was significantly better received because of
the education that we did, because the first responders were
on board, because we gave community previews, so it was
not like suddenly, you know, self driving car shows up
(11:27):
in the middle of the community.
Speaker 1 (11:28):
Right, okay. And in terms of, I mean, we've talked
about the autonomous side of things and the AI. Are
there any other AI techniques or technologies that have been
used for general community planning and development? Are there any
other tools out there that are currently being used?
Speaker 3 (11:45):
So from a Lake Nona perspective, it's pretty significant. We actually
have a very detailed data overlay that actually shows us
how the city is behaving. Everything is private, so there
is no personal identifiable information being collected, but we collect
a wide variety of behaviors. I know, you know, how
(12:06):
long people wait for an Uber. I know the specific
state of parking garages; every spot is instrumented, so we
know if there's a wait for them. We know how
the Beeps are flowing inside the community, and that is
fed into a large data environment where we actually use
AI driven tools to both predict and model the behavior
(12:28):
of the environment. We've done pretty sophisticated prediction on mobility using AI,
but we also use it for energy consumption. We use
it to detect unknown patterns like, for example, the impact
of having pets in the environment and how that changes visitation.
So when you look behind the scenes at what allows
(12:49):
Lake Nona to operate and what allows Beep to find such
a fertile environment for testing and operating these vehicles here,
there's a significant amount of AI and data that actually
powers our community.
Speaker 1 (13:04):
Yeah, that's pretty cool. Just as you were describing the amount
of data and being able to find all those stats,
it just reminded me of the SimCity series of games
that I used to play quite a bit, using
that data to make decisions to make your citizens happy.
Speaker 3 (13:19):
I may have said once or twice that I get
to play SimCity with a real city to a degree,
so I know exactly what you mean.
Speaker 1 (13:31):
We'll be right back after a quick break. Welcome back
to Technically Speaking an Intel podcast. When you think about
(13:52):
AI in our environment, the question of oversight often comes
into play. How do these tools manage incidents in the community?
What metrics or data are used to determine when an
AI tool should engage or intervene. I often think of
the pacemaker as an example of how AI can be
used to positively impact our lives. A monitoring system that
(14:14):
is set up to only act when a severe change
has occurred. BEEP is creating a system with checks and
balances that can be more reliable than humans in reporting incidents.
Vehicles are constantly collecting information, inside and outside, around what
they observe and encounter that can make the community safer
and more efficient.
Speaker 2 (14:36):
If you think of the in cab environment and
you think of the scenario of not having a person
of authority on board in the future, where there is no driver and there is
no attendant, I mean, we're developing tools
and techniques that monitor the activities of the riders to
(14:56):
ensure we understand if there is a health event,
you know, somebody slumps over in their chair as an example, or
if there's an unfortunate situation like somebody were to present
a weapon. You have to think of all these types
of use cases, and what's critical about that is being
able to process that observation and quickly align that with
(15:22):
how we would get some communication into the vehicle and/or
immediately dispatch support or services. You know, one of
the things that is so important about these vehicles is
in the event of an incident, you have the perfect eyewitness
every time. You're videotaping what's happened in an intersection, you're
(15:48):
leveraging that information and data to measure exactly how did
an autonomous vehicle respond, and so an important piece of
leveraging data in the future for the work that we're
doing is going to really reinvent how we do things
like supporting police activities out there in the area of
(16:12):
data collection and determining fault in scenarios, but most importantly
taking that data back and improving situations that may be
hazardous, like roadway conditions that result in accidents and things
of that nature. Externally, if you think of all the
data that is being collected, simple things that we're able
(16:34):
to determine by being out there on the roadways in
these different traffic scenarios are used to improve traffic flow.
And Juan hit on some of the things they do
with existing road infrastructure that can also be done with
the data that's collected through these vehicles. There are scenarios
where public works departments can utilize the data, and we
(16:56):
can send them examples of where a tree limb is
growing out over a power line, or potholes in the road,
or other circumstances that may create a safety issue that
need to be addressed. And so there's just an enormous
amount of observation that's going on every time we are
(17:17):
on a route that can serve so many important purposes,
just to proactively address things before they become problems.
Speaker 3 (17:25):
I think it's pretty unique that you have now these
autonomous vehicles moving throughout communities. They carry people and provide service,
but they're also a very accurate scanner, right? Autonomous vehicles
have cameras, they have lidar. When you ride the Beeps,
you actually see in a display what the vehicle is seeing,
(17:47):
and it's like recording every minute detail of the environment,
and it's a three D view of the world around it.
So it's, I think, a unique opportunity, and one that
we haven't fully utilized yet, of having these objects,
three D scanners traversing the community thousands
of times a month, and they can provide us with
(18:07):
an incredible amount of information. We just haven't utilized as much
of the data that the vehicles generate as we could.
Speaker 1 (18:19):
But there's a lot more to Lake Nona than its revolutionary
public transportation. One thing that stands out to me, which I
hope more towns and cities will consider, is Wi Fi
access for all its residents, something that's quickly becoming an
essential utility. Lake Nona is also home to the most technologically
advanced hotel in the world, the Lake Nona Wave Hotel.
(18:41):
Beyond the newfangled tech for residents and visitors, Lake Nona
also considers itself a living lab community where companies and
innovators can connect, collaborate, and test their prototypes and ideas
in a real world setting. And in terms of the
partnership with Intel, Juan, I'll start with you: what were
(19:03):
some of the technologies and help that Intel provided your project?
Speaker 3 (19:08):
So we are primarily an Intel shop when it comes
to processing. We utilize Intel CPUs for a variety of
the data that we collect, and we're even experimenting right
now with Intel GPUs as a way to actually do
some of the heavier data processing. So it's one thing
(19:29):
that's always running and always behind the scenes from our perspective. Now,
we have a variety of partners that actually
engage in some of the more advanced technologies that Intel
has to offer. But for our part, it's a strong
combination of tried and true, you know, CPUs, and
(19:50):
we're getting some pretty interesting performance results from Intel GPUs
now that make them usable for a variety of data
crunching tasks for data sets that we find interesting.
Speaker 1 (20:02):
Yeah, I just want to switch now a little bit
to the safety side of things. I've actually got a
bit of a background in mining, and I was around
for the advent of autonomous mining vehicles, with
those huge dump trucks being loaded and driven
without any drivers, which is a real sight to see.
Going through some of that technology, they had a very strict,
(20:26):
multi layer approach to safety. There were like seven tiers,
right down to people having actual buttons they can press
that just shut everything down. How have you tackled
the approach to safety, particularly in a much more open
environment than a mine site?
Speaker 2 (20:42):
First, I would tell you as you look at autonomous mobility,
safety is the primary driver of why these technologies exist.
You know, in the US, ninety four percent of all
accidents, and many tens of thousands of fatalities a year,
are a result of human distraction, impairment, and error, and that's
(21:04):
a well known fact. Obviously, taking some of the faults
of the driver out of the equation by utilizing technology
that's never distracted, never impaired, and always on is an
important aspect of this. It's not just about achieving an
equivalent level of safety, which is a common phrase used
(21:27):
in the standards for how you choose to put
an autonomous vehicle on the road. You have to prove
that it's equal to or better than the driven vehicle
in the eyes of our government, the US Department of
Transportation and NHTSA in particular. Well, if you think of
the opportunity, and Juan hit on some of the technologies
(21:48):
in Lake Nona to have roadside infrastructure that is looking
down a roadway, communicating with our vehicles and telling us
that the trajectory of a particular car at a
particular speed means it's very likely to run
that red light. So it's not just about the vehicles themselves,
(22:10):
it's about that entire connected infrastructure and how you use
other technologies to give you views of scenarios or predict
the event that may happen, given the information that we're
perceiving from roadside infrastructure or intersection infrastructure, that can be
fed to these vehicles to dramatically improve safety and reduce
(22:36):
some of these scenarios that candidly a human would never
see or understand from their vantage point just behind the
wheel of a car. And so I think those things
are equally as important as the great work that's going
on with the autonomous platforms themselves.
Speaker 1 (22:52):
Now, looking into the future, Joe: as you know, AI
is evolving very rapidly, particularly around generative AI and even
just the visual AI capabilities, with new GPUs coming out
all the time. How do you place Beep strategically to
take advantage of any sort of new technologies that
come out, so that you're keeping ahead of the
(23:15):
competition and also able to serve your communities better?
Speaker 2 (23:19):
If you look at the future of autonomous mobility, obviously
the market that we are focused on, and you think
of expanded use cases, evolving from what today in
our world are planned services, planned routes, geofenced areas,
(23:39):
the broader that you expand the horizons of the
types of environments that these vehicles would ultimately traverse and serve,
the more critical it's going to be that we
as a business stay out in front of how we
leverage AI to improve what these vehicles are able to do.
(24:01):
It's going to be imperative for our business model to
succeed by utilizing the technology, and AI technologies in
particular, to be able to understand, perceive, and properly respond
to these situations that are out there, both on our
roadways and in our vehicles, so that we can provide
(24:23):
a safe, convenient service for expanded use cases across the country.
Speaker 1 (24:29):
Did you want to add to that?
Speaker 3 (24:31):
Definitely, and maybe fast forward a little bit more into
the future. Today, we use AI and we use the
tools that we have in our toolkit to make things
safe and efficient, right, and that's definitely the right order
to take. I mean, safety is the number one concern
(24:53):
and then making sure that it's efficient. But then once
you tackle those I think AI opens the opportunity for
things that are very unique. How about the vehicle recognizing
that the people on board, because we're able to
look into their schedules, have an extra two minutes,
and there's a side road that could be calmer, right,
(25:18):
where they could see a lake? Or what if you're
able to figure out that there's a live event going on,
and instead of you only having the opportunity to
attend because you're there, the system automatically redirects the non
essential traffic to a route where you can actually listen to
live music as you go? I think the experiential
(25:40):
opportunities of this intersection between technical AI for efficiency for safety,
coupled with, let's call it, human understanding powered by AI.
They open these intersections that we haven't thought about. Right,
maybe when we get the next version of your routing
(26:01):
on your GPS, when you pull it up on your phone,
it's not going to say avoid tolls. It may say
bring my blood pressure down, right? It may say let
me discover the place that I'm in. That's the thing
that really excites me is sure we'll use the tools
to make sure we tackle the technical so that we
(26:22):
can deliver the experiential.
Speaker 1 (26:24):
Okay. Finally, I'd like to sort of wrap it up
with some ethical type questions. We talked a little bit
about data privacy and user privacy. You do work with
a lot of local governments and local municipalities. I'd like
to get your thoughts on how we strike that
balance, or even if indeed there is a balance, or
whether we should just ensure by default that users' privacy
(26:48):
is sacrosanct.
Speaker 2 (26:50):
First, I mean, obviously, even with the data collected, we
have to honor the PII restrictions and other things that
exist, and certainly respect that right to privacy.
I will tell you that a lot of the information
that's gathered is not to identify details of an individual.
(27:12):
It's about taking that collective body of information to predict
certain outcomes or events and identify certain behaviors that would
enable us to address the situation or perform a different service.
But it's very, very critical that we're able to capture these images
(27:32):
and the information that we do, to ensure we're improving
the safety and performance of these types of platforms, and
work within, obviously, the respected boundaries that we all have.
Speaker 1 (27:45):
For our audience, can you just define PII?
Speaker 3 (27:48):
Sure. It's personally identifiable information, usually a collection of things that
can allow you to identify a person, like, for example,
your name, your address, your telephone number, and in some
other cases things like your biometrics like your face or
other things that are uniquely attachable to you. I mean
(28:09):
other environments and other users of data, I think, have
a much tougher situation, because they have to deal with
personally identifiable data to conduct their business, because who you are
is critically important to how they deliver the service. It's
not yet for what we do, and by just not
(28:30):
collecting the data, and by making sure we have no
opportunity to actually look at one individual, only collective data,
we put ourselves in a situation where we are not
infringing on people's identities or privacy.
Speaker 1 (28:45):
That's good to know. Thanks, Joe and Juan, for your time today.
It was really great talking to you and I've learned
a lot.
Speaker 3 (28:54):
Thank you, Graham.
Speaker 1 (28:54):
Yeah, thanks very much.
Speaker 2 (28:55):
Enjoyed it.
Speaker 1 (29:00):
I would like to thank my guests Joe Moye and
Juan Santos for joining me on this episode of Technically Speaking,
an Intel podcast. I gained significant insights from my guests
today and I hope you found it enlightening as well.
My primary realization is that AI and technology have the
power to shape and nurture local communities. I'm always inspired
(29:20):
by grassroots solutions as opposed to overarching, top down strategies.
Both Joe and Juan emphasized the criticality of data privacy
and the necessity to protect users' personal details, particularly since
they are working with local governments and agencies. On a
technical front, it's evident that BEEP is adapting and evolving
in its approach to autonomous vehicles. Currently, their shuttle models
(29:41):
are facilitated by attendants, but the trajectory suggests that in
a few years, these shuttles might operate autonomously with minimal supervision.
Watching this transformation unfold is genuinely exciting. While it's
easy to be captivated by new technology, and I'm no exception,
it's crucial to prioritize the user experience and the
tangible benefits it brings to enriching lives. From the Roman
(30:03):
aqueducts to present day innovations, it's the relentless drive and
commitment of visionaries like Joe and Juan that propel us forward.
With a touch of luck and their pioneering spirit, we
may soon pave the way for a future that would
leave even the Jetsons in awe. Please join us on Tuesday,
December twelfth for the next episode, when we will learn
(30:23):
about how Intel's AI for Workforce program is making learning
AI more accessible. Technically Speaking was produced by Ruby Studios
from iHeartRadio in partnership with Intel and hosted by me
Graham Class. Our executive producer is Molly Sosha, our EP
of Post Production is James Foster, and our supervising producer
(30:45):
is Nikias Swinton. This episode was edited by Cira Spreen
and written and produced by Tiree Rush.