Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Brian (00:00):
Hey everybody, welcome back to the Mindful Bytes podcast. Today we're going to be talking about Mark Zuckerberg flexing his Meta AI muscle at Meta Connect 2024. So, with that, let's go ahead and dive on in. All right, so let's go ahead and introduce our panel. We're missing one person. Doesn't it always seem like that?
(00:23):
The Gen Z is the one that's missing a lot. Does anybody else...
Olivia (00:26):
Yeah, yeah, I think you need to go back and listen to that episode about stereotyping age groups.
Brian (00:37):
Hey, I mean, I'm not stereotyping. He's a Gen Z and he doesn't show up. He's always out somewhere, out and about, traveling, seeing the world.
Shawna (00:46):
Don't make it sound like that. He always shows up when he's supposed to.
Brian (00:49):
That's right, this is scheduled time off. It is, and next week he will be joining us to talk about these Meta Ray-Bans that I'm actually wearing right now. But before we get started, let's go ahead and introduce the panel that did show up.
Shawna (01:01):
Hi, I'm Shawna. I am your electronically unimpressed Xennial.
Olivia (01:05):
That was a good one.
Brian (01:07):
Unimpressed.
I like that.
Olivia (01:08):
I'm Olivia, a millennial, on social media more than the Kardashians.
Brian (01:13):
I'm Brian, your Gen X business leader slash digital guy, and today we're going to talk about Meta Connect. So I want to dive into this and get your guys' thoughts on some of the things. Shawna, I know you weren't at Meta Connect. Olivia, did you catch anything from Meta Connect? Okay, cool, all right.
Shawna (01:31):
I do want to say one thing about it. I don't really think that you should be involved in, like you know, dating conferences.
Brian (01:41):
It's not a dating conference.
Olivia (01:45):
Sounds like it to me. And it's like, oh, let's not go there. Don't go finding yourself...
Brian (01:53):
A meta-boo, please. Oh my gosh, we're gonna have so much to explain just in this first 30 seconds of this podcast. So no. Well, I do want to talk about some of the stuff that came up during Meta Connect, and I want to just kind of touch on a couple of things and let you guys share your thoughts.
(02:14):
First, I want to get to the Meta AI stuff, because that's where really a lot of stuff is happening. But I know some people might be tuned in and be like, okay, well, tell us about the new VR headset. Yeah, just real quick: it's the Oculus 3S. It's a little bit lower-end than the Oculus 3, the mixed reality headset. The difference is there's a $200 price difference, so that's a big
(02:35):
deal. You can get the Oculus 3S for $299. And the big difference on it is going to be the lenses. So they're not as clear, but they are still mixed reality. Great headset. So if you're wanting something a little bit cheaper, the Oculus 3S is the way to go. But go check that out.
Shawna (02:54):
Before you move on, is it called the Oculus 3S? Because that doesn't sound right to me.
Brian (02:58):
Yeah, let me go ahead and fix that. So let me go ahead and rephrase: it is actually called the Quest 3S. Oculus was the original name, but it's the Quest 3S. Olivia, when we first stepped into the metaverse almost three years ago, do you remember what we said about the experience when we stepped in there?
Olivia (03:16):
Yeah, I mean, we talked about how it was being able to connect like we've never seen, you know, like with social and all of that.
Brian (03:28):
Yes, yeah, exactly. I was wondering... there's so much that has changed. Uh, you know, we post stuff on Instagram and stuff like that. But I want to point out, Zuckerberg and them made a statement. Basically what they're saying is their app, which is what we're on, is called Horizon Worlds, and it's continuing to get more upgrades. It's really getting a lot of upgrades.
(03:49):
This year we all just got new avatars, which look way more real, which is pretty cool. But outside of that, they made a statement that said this is their social app and they are a social company, so they're going to continue to invest in this, because they see this as the next platform for social. So anybody that's in social media, you can't ignore these
(04:12):
things. They're continuing to evolve, and it's happening very quickly. I mean, in three years there has been so much that has changed, and it's becoming more and more easily accessible, and the connection is like we've never experienced. That's why the dinosaur is in there, guys. The tech dinosaur is in there because she realized... Shawna, tell us why you got in, why you decided to join us.
Shawna (04:34):
Yeah. So I had a headset, but I wasn't really into it. I didn't really enjoy it. It was not that comfortable for me. And when the two of you were hosting your show, you know, I started to hear all the great stories that were coming out of that, all the great connections you were making. And so when Olivia needed a break from hosting and didn't
(04:59):
want to be tied up in the evenings as much, I was like, put me in, I want to do it, let me do it. And now, I mean, I don't think you'll ever get me out of that seat.
Brian (05:08):
Well, I wanted to bring that up because, you know, we're in social media and we're all about connections. And I know, Olivia, we're going to be in there again, I think the end of this month, the fourth Thursday, doing the Mindful Bytes. We'll all be in there at the studios doing that live, and you're going to see how much these avatars have changed. It's crazy. So, very cool things.
(05:28):
They're also bringing Meta AI into it. That's already in there right now, where you can talk and ask questions. But what really is cool is, this is something that, I guess it's been two years ago, I brought up to somebody. I ran into somebody at Meta and I said, how far are we away from being able to step into a blank world and say, create this, and
(05:48):
AI will create it for us? Well, during their Meta Connect they showed a preview of someone building a world and actually typing it in. They would type, like, a generator, yellow, with rust on it, and AI generated it, created it for them to put in their virtual space. So it's going to become even easier for people to build things, add things to the world.
(06:09):
But not only that. Another thing that I thought was really cool, and then we're going to transition, is that they're going to bring in the ability, which I think they said is going to happen in the next year, where we can create AI avatars. So they call them NPCs. Non... what does that stand for? Does anybody know that? That's where our Gen Z should be in here.
Shawna (06:28):
It's non-player character, I think.
Brian (06:31):
Yeah, non-player character. I think that's right. Way to go, tech dinosaur.
Olivia (06:38):
But she knows everything, yeah. I'm not impressed.
Shawna (06:44):
If I'm right about that, it's because I learned it from the movie Free Guy. You remember that?
Brian (06:49):
Oh yes, yeah, Free Guy. The quote that we have in our virtual studio that says, I might have created this, but I can't live here. Is that what it said? Is that not the quote?
Shawna (06:58):
Yes. I created this world, but I can't live my life in it.
Brian (07:03):
Yeah, yeah, yeah. So a lot of stuff coming. That's going to be cool. Imagine, at the studio, Killer Bee Studios, in there where we host our shows, having avatars that are AI. So when people come in during the show, like between shows when we're not there, they can ask the AI questions, it can give them tours of the studio, it can connect them to us in
(07:27):
real life, like make a connection, say, hey, somebody from your virtual studio wants to say this to you, and it comes in. I mean, these are things that are going to start closing the gap between these two, I guess, reality and virtual and mixed reality. So it's going to be cool to see. So pay attention to it. I'm telling you, it's going to continue to evolve in that area. Um, I guess, Olivia, what are your thoughts on that?
(07:48):
Before we go on to the flexing of Mark Zuckerberg's Meta AI muscles. Um, and I've got pictures. No, I'm just joking.
Olivia (07:56):
Uh, yeah, no, I think that, you know, Meta has really, even with what we've seen with Instagram, they're really wanting to create connection. Like, really focusing on that again, right? And making it
(08:21):
easier to be able to connect with people in there, I think, just falls in line with the overall goal that they're trying to do. So, yeah, I think that's great and I'm excited to see what happens with all of this as well.
Brian (08:41):
You know, I love that. I love it when we look back and realize that it was a few years ago when we started building the studio. The big thing was, like, we want to build for connection, because we feel like that's the future. And now, gosh, I mean, Meta even called their conference... Killer Bee is onto something here.
(09:02):
Let's change this call to Meta Connect. So you're welcome, Mark. But I mean, the connection is so important, and it's going to become more and more important as we move forward. So if you guys are listening to the podcast and you're like, I don't understand, what are you talking about, send us a message; in the show notes I'll post a link. You don't have to have a headset to come join us. You can join us from your mobile phone to kind of test it
(09:26):
out and see what this is all about. We'll put a link in the show notes. But yeah, we would love to hear your thoughts on this as well. All right, let's transition to what really impressed me the most at this conference. First off, let me go ahead and clarify: I did not fly out to California to go to this conference. I know that they will send out invites and bring people in, so
(09:47):
hey, I mean, if Meta wants to fly us out next year, hey, we'll take that. That's cool. But what's great about it is you don't have to. We can go to these conferences in virtual reality, and you would be blown away by what it's like. They have great team environments where you can unlock digital assets together, working together, icebreakers where you're doing challenges together with people that you
meet from around the world A lotof fun.
But the video conference islike a 3D video.
I mean it feels like you'rethere, so it's hard to explain.
You just have to be in there.
That what really set it off forme was, I'll be honest, I've
messed up a lot of different AIplatforms when it comes to
(10:29):
generating content, doing photos.
We've been messing with videosright now as well.
We've seen some of the thingscoming.
But Mark went over a lot ofstuff that they're doing with
their meta AI and at first Iwasn't impressed with meta AI
I've been using, I've messedwith it with the last couple
years.
I'm like, yeah, it's just notthere.
(10:50):
But what he showed at thisconference was a game changer.
I was like it really got myattention.
So here's some of the thingsthat he showed.
First, he was talking aboutnatural voice, so this is kind
of already.
On chat GPT, they just did anupdate, but if you're using the
app, you can just click the micand talk to chat GPT and it can
(11:12):
talk back to you.
So it's a lot easier thantrying to type all that stuff in
.
So that's cool, and they'retying it in with like
celebrities and stuff as well.
So that's neat.
Now I will let you know.
There is a blog on Killer Bee,killer Bee's website.
I'll put a link to it in thepodcast show notes.
It's just go to kbdigitalcomand you can find it there.
But I will post the link to itif you want to read more.
(11:34):
But I'm just going to kind of skim through this so we can talk about it. So the natural voice is really cool. The thing that really got my attention at first was this Imagine edit. I don't know if either one of you have ever used, like... do you guys know what I mean, Olivia and Shawna, when I say Imagine edit? Do you guys know what I'm talking about?
Shawna (11:56):
Are you talking about a plug-in for ChatGPT? Oh no, you're talking about Meta AI.
Brian (12:00):
Yeah, I'm talking about Meta. So, okay, these image platforms, like the one that we use a lot, Midjourney, they have, like... you have to put into the chat, like, forward slash imagine, and then you can tell it and it will create an image. So that's why they kind of call it, I'm not using bad grammar, Imagine edit, how you'd think image edit, but they
(12:22):
call it imagine because you actually have to type that word in. So they've made some changes to that. But I started messing with it, and I was blown away, because the edit part was something new for me. We use Midjourney to say, hey, create this image. If you look at the blog on Meta Connect that we posted, there's actually a picture of Mark Zuckerberg flexing his Meta AI
(12:45):
muscles, and he's got, like, robot arms and he's flexing his biceps. That's all generated with Midjourney. Now, further down in that blog, you're going to find where, on Messenger, you can open up Messenger, and this is so easy now. You can actually click the little image icon, choose an image to upload. Now, when you do that, if you just send the image, it'll
(13:07):
actually come back and tell you in text everything that it sees in that image. So that's one cool thing that you can do, which plays a role even with Meta and these Ray-Bans, like the Meta AI Ray-Bans I've got on right now. I can ask it to tell me what I'm looking at, and it sees and tells me, in my head, what I'm looking at. They're partnering with a company to help people that
(13:29):
can't see with these glasses. The glasses will actually be able to read to you what it's seeing. I'm getting too far into the glasses already, so I've got to kind of pull it back here for a second.
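For anyone curious what that describe-the-image step looks like in code, here is a minimal sketch of image-to-text captioning using an open model from the Hugging Face transformers library. It only illustrates the general technique being described, not Meta AI's actual system, and the file name is a placeholder.

# A minimal sketch of image captioning: send a photo in, get a text
# description back. This uses an open model via Hugging Face transformers;
# it is not Meta AI's implementation, and "photo.jpg" is a placeholder.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
result = captioner("photo.jpg")  # accepts a local path, URL, or PIL image
print(result[0]["generated_text"])  # e.g. "a man wearing a gray fedora and a red shirt"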
But with the Imagine edit, you upload the photo and you can say... you'll see a sample of this on the blog. I uploaded a photo of Ashton, you guys, and when I uploaded it
(13:51):
he had, like, a gray fedora on, he had, like, a red shirt and some blue jeans, and I uploaded that. It told me everything it saw, and I said, hey, can you change his... can you put a brown leather jacket on him? And it put a brown leather jacket on him within seconds, and it looks like he's really wearing the brown leather jacket. Then I said, well, can you change his fedora and make it white? So the fedora became white.
(14:14):
And then, lastly, I was like, hey, can you go ahead and change his jeans to be black? And it did that, and it was mind-blowing. Actually, let me go ahead and show you guys. So I know, Shawna, you saw it. I want Olivia to see it too, because you can kind of see, this was the progression right here, Olivia. So it starts off with the red shirt, then the brown leather jacket,
(14:35):
white hat and brown leather jacket, then black jeans with the white hat. Wow, wow. And it really looks so real. And for those of you listening, those pictures are on that blog if you want to see them. Yeah, definitely go check it out. But I mean, editing like this I have never seen with AI.
(14:55):
So I was like, whoa, this is a big deal. So let me stop there. What are your guys' thoughts on this right now? Just on that.
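For anyone who wants to try instruction-based photo editing like that themselves, here is a minimal sketch using an open image-editing model through the diffusers library. It is only an illustration of the general technique, not Meta AI's Imagine edit, and the file names and prompt are made-up examples.

# A rough sketch of editing a photo with a plain-language instruction,
# using an open model via the diffusers library. This is not Meta AI's
# Imagine edit; "ashton.jpg" and the prompt are placeholder examples.
from diffusers import StableDiffusionInstructPix2PixPipeline
from PIL import Image

pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained("timbrooks/instruct-pix2pix")
photo = Image.open("ashton.jpg").convert("RGB")
edited = pipe("put a brown leather jacket on him", image=photo).images[0]
edited.save("ashton_leather_jacket.jpg")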
Brian (15:03):
Go ahead, Olivia, you can
go first.
Olivia (15:06):
I mean, yeah, I think
that's really cool.
It also, you know, like you talked about, gives blind people the ability to even create a graphic if they wanted to, or be able to express a picture that they have in their head,
(15:26):
right, or something like that, which I think is cool. So it kind of gives them a voice. But also I see a lot of, like, scary things to it as well, because I'm already... this is your week to talk about all the red flags, so please go ahead
(15:50):
and yeah, like, well, specifically with, like, you know, the terrible hurricane, that Hurricane Helene. Right, there are a lot of AI images of presidential candidates in a boat on the water carrying babies that
(16:11):
people believe are true, that a certain candidate is out in the water where another one is not. And so to me the scary part is just how quickly people believe these images, because they do look so real. So that's what's scary to me.
Brian (16:36):
Yeah.
Shawna (16:36):
So, Brian, is there a thing... I know, like, there are tools where you can put writing in and say, was this written by AI? Is there something like that yet, where you can scan, like, a photo or a screenshot from your phone and ask if it's AI? Is there a tool like that?
Brian (16:52):
That's a good question. I don't know of one, but I've never searched for one. So that's really good, because I know they have them for writing and stuff like that, but I do not know about for graphics.
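One weak signal you can check yourself today is an image's embedded metadata, since some generators write a software or credentials tag into the file. Here is a minimal sketch using the Pillow library; the file name is a placeholder, and, as the panel notes next, watermarks and metadata are easily stripped, so this is nowhere near a reliable detector.

# A rough heuristic, not a real AI-image detector: look for generator
# hints in an image's metadata. Many tools strip this, so finding nothing
# proves nothing. "suspect.jpg" is a placeholder file name.
from PIL import Image

img = Image.open("suspect.jpg")
exif = img.getexif()
print("Software tag:", exif.get(305))  # EXIF tag 305 = "Software"
print("Other metadata keys:", list(img.info.keys()))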
Shawna (17:02):
Now, I do know, like, if you look at those images, you'll see the ones to the right and the one in the middle have a little watermark in the bottom left-hand corner. But just like anything else, that can be easily removed. Yeah, so yeah, I've been thinking about that, because I thought, you know, it's going to be hard to trust things that you see, even as far
(17:25):
as, like, you can say, look, I saw your husband with another woman, or I saw this or that. When I saw these pictures that you created of Ashton, I was shocked, because I thought, my goodness, I would never have guessed it was AI. You know, up till now, a lot of the photos that you see that
(17:45):
are AI have that certain look to them where you can just tell. You know, a lot of them on Facebook and stuff, you would just know right away. And now it's getting harder and harder and harder to tell. So, you know, people are going to have to be wise and not believe everything they see in a picture on Facebook.
Brian (18:05):
So true.
Shawna (18:05):
I mean, so true. Of all the silly things, like on one of my plant groups, people keep posting these fakes. It's a Monstera group, actually, and they keep posting these AI Monsteras that are, like, you know, as big as a house. And listen, they can get big in the rainforest, and down here in Florida even, you know,
(18:27):
they're, who knows, like 50 feet tall. But it's like, people are figuring out that they're AI, and then the people admit that they're AI.
Olivia (18:36):
You know, people aren't always going to admit when they're lying, and it could be something that's life-changing that people are making. Yeah, there's even this popular plant shop near where I am, where they did a post, because people kept coming in to get this particular plant.
(18:57):
It was, like, I think it was blue or something like that, and they were like, that is not a real plant.
Brian (19:04):
Like what is it?
Olivia (19:05):
AI-generated pictures. So that is, like, the thing too, that people are going into businesses looking for these things that, you know... oh my gosh.
Brian (19:20):
Well, you know, you talked about the hurricane. Shawna, you and me were just watching reels. I was like, did you see this video from the hurricane? And then we're both sitting there, and I was like, wait a minute. And you said the same thing. You're like, I think this is one of those AI-generated ones. There are people that are using AI to generate reels of storms, and it's not real footage. They're using footage from somewhere else and they're just
(19:42):
trying to build up those views. And it's putting that fear and those thoughts in people's minds, and it's not even real footage. So it's such a different time that we're stepping into, where, again, connection and being really connected to these things is going to mean so much more, because it's getting hard to tell them apart.
(20:02):
It feels like we're almost repeating this every time we meet about the AI. But there's a lot of concerns here. And it's a very cool tool; Ashton actually said, hey, can you send me that photo? Because I want to actually go and find those things as an outfit. So that's cool. He wants to go and he's like, I want that brown leather jacket, I want the black jeans or whatever.
(20:23):
So he's going to go match all that up with a white hat in reality, which is cool. So it can be used for good, but also you have to be careful. I didn't even think about, like, the spouses and stuff like that, Shawna. That's crazy to even be thinking about.
Shawna (20:38):
So yeah, um, like, someone could really try and ruin someone's life by creating some things, you know, for, like, revenge or something like that, for sure. Yeah, and I was just gonna say, I can see
(20:59):
this even changing the way that, like, investigators have to work, and the things that they have to be thinking of, the things they have to have in mind, and the tools that they use.
Brian (21:10):
So I can tell you this: now, it's only going to take a little bit of time before it gets better. But with the photos that I was getting from Meta AI's Imagine edit thing, when I downloaded the photo, it wasn't even close to the resolution of the original. So I could zoom in and it got blurry pretty quickly. So that was a good thing. But that's going to change, because with Midjourney you can actually
(21:33):
take Midjourney now and say, create this image. Actually, you know the image up here at the top, of Mark flexing? That's all from Midjourney. So it created that image from what I typed in, and then I could say, let's increase the resolution, so I can make it a really big, high-quality image. So if I download it, you can zoom in really closely and it's
(21:54):
fine. So that stuff's gonna get better on that side too. But I've never experienced the editing part. That's a whole different thing. It's not just creating a new image; it's editing one. So I couldn't even imagine trying to do that in Photoshop, the hours that that would have taken. So yeah, game changer, game changer. So, a lot of cool stuff there. So, okay, let's talk a little bit
(22:16):
more about some game-changing stuff that is going to change it even more: the new Meta AI Studio. Uh, so I won't go too deep. There is a video on the blog, so go check it out. But Olivia and Shawna, just to kind of give you guys a visual of what this is, with the new Meta AI Studio that's coming out,
(22:37):
they showed an example where they created a video version of this guy. His name is Don Allen Stevenson III. He created an AI version of himself. Zuckerberg was live and he was on the phone, and you can see Zuckerberg's video at the top right-hand corner.
(22:57):
He was on a FaceTime. The rest of the phone was a video of that guy, and it was his AI, and he was standing there next to him. He started asking his AI questions, and the AI assistant responded back to him, looked like he was talking, sounded like him, and he's using that as an assistant. Olivia and Shawna, okay, so this is cool.
(23:20):
You were talking about ManyChat not too long ago, Olivia, so I wanted to bring that up, how you said ManyChat can help assist with things. These virtual assistants can respond to comments and all that stuff for him. So imagine, at that scale, now you have a video AI assistant that looks like you, sounds like you. Pretty mind-
(23:44):
blowing there. Definitely go check out the video on the blog. You're going to love this if you're in social media and you're doing reels and stuff. They actually showed an example on there of how they're using AI, and this will be coming out to Reels and all that, where when you upload a reel, it'll translate that reel into
(24:06):
multiple languages, and it'll dub the video so your lips actually sync with it. And they showed examples of it, and you couldn't even tell. I mean, I could actually dub myself talking English properly and then into other foreign languages. Let's fast forward. I got two more things just to cover real quick. I'm not going to cover this one too much, but this is something
(24:28):
new. These are the Meta Ray-Bans right here. So they showed these... like, I don't know if you can see them. Can you see them? Is it focusing?
Shawna (24:37):
No. Well, okay, there we go. There it is.
Brian (24:40):
So this is the limited edition of the Meta Ray-Bans that they brought out. They're clear. But I couldn't hear you anymore, because the speakers are in here, so let me make sure I can still hear you. Can you guys talk?
Shawna (24:50):
Yeah, can you hear me?
Brian (24:51):
Yep, I got you. Okay, so I'm listening to you on this meeting on these glasses. Now, Ashton's going to go deep into all the stuff the glasses can do, but I want to talk about what they covered. This is really cool, and this is why we ended up buying a second pair. So Ashton's got a pair now, and I bought these. The AI abilities inside these glasses: last year you could talk
(25:13):
to Meta AI, and that was cool, but they have expanded that. You can do videos, recordings, all that stuff. That's the stuff Ashton's going to be talking about. I think he might have some examples he's going to show, too, of what they covered on here. This is, like, another great use case. Shawna and me, we're looking at traveling outside the country
(25:34):
for events to go and be a part of. With that, people were telling us, hey, it's going to be very important that you have a guide with you that can read the different languages. Well, with Meta AI on these glasses... and I just started playing the video from the computer on that by tapping my glasses. I just need to not touch the glasses
(25:57):
when I'm talking. Yeah, don't touch the glasses. Um, when you're out of the country, get this: you're going to be able to look at signs in different languages, or at the airport, and ask the glasses to tell you what that says in English.
Shawna (26:15):
Wow.
Brian (26:15):
So it's going to be able to read that and translate it. They even showed where you can do translation in real time with somebody else, with another pair of glasses on, where you can have a conversation, whatever. If they're talking Spanish, your glasses... people can't hear it unless they get, like, really right up on you; the audio is very cool, how they got it worked out. But I can hear what they're saying in English,
(26:36):
in my language, and they can hear what I'm saying in Spanish.
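At its core, that feature is machine translation running on the speech once it has been transcribed. Here is a minimal sketch of just the text-translation step, using an open model from the Hugging Face transformers library; it is an illustration of the general idea, not the software on the glasses, and the Spanish sentence is only an example.

# A minimal sketch of the translation step behind a feature like this:
# Spanish text in, English text out. This uses an open model via Hugging
# Face transformers; it is not the glasses' software, and the input
# sentence is just an example.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-es-en")
result = translator("¿Dónde está la puerta de embarque?")
print(result[0]["translation_text"])  # roughly: "Where is the boarding gate?"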
Very cool things there. Even to the point that you're going to be able to scan QR codes with these things, which you can actually already do. You can look at a QR code and say, scan that QR code, and it opens it up on your phone, just by looking at it. You don't have to take out your phone and take a picture.
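The decoding part of that is well-trodden ground. Here is a minimal sketch of reading a QR code from a single camera frame with OpenCV, just to show the step the glasses automate; it is not Meta's implementation, and the image file name is a placeholder.

# A minimal sketch of decoding a QR code from one camera frame with
# OpenCV. This only illustrates the decode step the glasses automate;
# "frame.jpg" is a placeholder file name.
import cv2

frame = cv2.imread("frame.jpg")
detector = cv2.QRCodeDetector()
data, points, _ = detector.detectAndDecode(frame)
if data:
    print("QR code contents:", data)  # e.g. a URL to open on the phone
else:
    print("No QR code found in this frame.")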
The other cool thing is it helps, as we talked about at the
(26:59):
beginning, trying to remember how long we have been in the metaverse, it can actually help remind you of things. So, like, where did I park my car? It can remember where you parked your car. I don't know if you ever get to the airport and you take a picture of the number where you're parked. You can just look at it and say, hey, remember this, and it remembers it. There's so many things that you can do. Again, it's on the website.
(27:20):
Go check it out.
What are your guys' thoughts on that before we go to the last thing?
Shawna (27:33):
Go ahead, Olivia.
Olivia (27:35):
I think they're cool. I'm excited to hear what Ashton has to say, so I will keep my comments and surprise until then. It's very cool.
Brian (27:46):
I know something that Ashton didn't test. I will tell you this: I took a picture today with the glasses, and I asked the glasses, hey, can you send my last photo to Shawna? And it said, you want me to send the last photo to Shawna? I said yes. It said, okay, sent. So it took the photo that I took with my glasses, and you can do it with videos or whatever, and you can send it to somebody
(28:06):
right after you do it. So, a lot of cool stuff, and it plays well with social media. So I think, Olivia, you'll like that stuff too. So, all right, let me go ahead and go to the last thing. This was the big reveal, and it was really cool. You definitely got to go check it out on the blog, watch the video. It's right at the very end; you can skip right to it. It was very cool how they delivered it.
(28:27):
It was very James Bond-ish. The guy came walking out with this metal suitcase handcuffed to him and everything, and they brought it out to Zuck. And it was cool that, if you look up the guy on Instagram, he actually recorded it with his glasses too. So you got the first-person view of him actually walking out to Zuckerberg at the conference and Zuckerberg opening this
(28:47):
thing up. Really cool. But anyways, they brought out what they call their Orion glasses, the Orion AR glasses. It's the first of its kind. It was mind-blowing. It was like Back to the Future stuff, and it's, like, all holograms and stuff like that
(29:08):
of people that come in your house, even. Like, to even explain it, you could be sitting... They kind of approached the challenge that we have today with technology, where you're sitting at a dinner table, or you're sitting out at dinner, and all of a sudden your phone dings and you grab your phone, and we see that happen all the time. They actually touched on that a little bit, about how this
(29:29):
technology was created to bring us together, but it's kind of, like, become like an interruption. So with these AR glasses, some of the stuff they showed... and again, it's just too much for me to explain on the podcast; you've got to go check out the blog and watch the video. So it uses a neural interface. What it is, it's a bracelet that wraps around your wrist, and they showed people using them, and they had, like, Gary Vee,
they showed a person using themand they had, like Gary Vee,
all of them, using it to test it out, and they were blown away. Gary Vee even said this in the video. He said, when TV came out, it kind of took over radio. And when the smartphone came out, it took over TV. And he said that these Orion glasses are basically going to
(30:16):
replace computers. It's like the next big thing, and we saw that coming with AR. But what was cool is they showed, like, an example with that bracelet. There was a guy that was sitting in a group with his friends, and they're eating and stuff, and a message came in on his glasses, and it popped up right in front of him, like, in the space.
(30:36):
Nobody else could see it, and he didn't have to move his hand or anything. He had his hand beside him and he just went like this. I guess, using his eyes, he could select it, is what I'm assuming, but just by pinching his fingers together beside him, it realized, like, hey, he could respond to it, he could close it. But they actually have holograms of people coming up and meeting with people.
(30:57):
I don't know, it's just incredible. You definitely got to go check out the video. I'm not going to ask you guys your thoughts on it, because I don't think you can share your thoughts until you see the video.
Shawna (31:11):
But it's a cool technology. How, right now, you have to have, like, cell-phone-free time? You're gonna have to have glasses-free time. Like, I'm gonna have to be like, put your glasses in your office.
So that is a question I do have.
Olivia (31:25):
So yes, is it.
Can you get your prescriptionor is it general, like for me, I
have to wear contacts or Ican't see, and when I'm not
wearing contacts, I have to wear glasses.
So is that going to mess up myvision to have those over my
(31:46):
contact, or?
You know what I mean?
Do you see what I'm saying.
Brian (31:50):
So let me clarify too. The Orion glasses: they just released the developer version of it, so they're testing it. I have no doubt that it's going to be able to do that, Olivia. I mean, I'm suspecting that, since they actually gave the teaser now and they brought it out so everybody could see what it could do, my guess is it's probably going to be out here within the next two years, where we
(32:12):
could buy it. That's what I'm guessing. I don't think he would have brought it out that early.
I could be wrong, um, but I definitely think they'll be able to do that, because, like, even these Ray-Bans, you can get them in prescription. So I'm guessing that you can definitely do that. Like, these glasses right here, they will transition as well. So if I go outside, they turn like a sapphire blue; the lenses do, so they tint. But you can't get them
(32:34):
prescriptions. So I'm sure that, if they're going to do AR glasses... especially, he said right now they're working on the developer version; they're going to be working on building more apps. So once it does come out to the public, there's going to be lots of apps that you can use. Uh, which is something that they probably learned and made a note of from Apple's release. They didn't have a lot of apps come out on the Apple Vision Pro,
(32:56):
so there's that. He also said they're working on getting the glasses down a little bit, like they want to get the size down more, because when you wear them, you look at them and you're like, those are cool, but I'm not going to wear those; they're a little too big still, like thick frames. But you know, there's so much technology in it. I mean, these glasses right here, it's cool to see the clear version, because you can see all
(33:17):
the electronics inside of it, and they're all in the size of a pair of sunglasses. So, very crazy that you can get technology like that in glasses like this. But to see that they're going to get the size down, and they're working on getting the price lowered so it's more available to a broader audience. So that's what they're working on right now. So, yeah, I think that we will see that, but again, I think
(33:40):
it'll probably be a year or two before we actually see it in a customer's hands.
Shawna (33:45):
One big concern I have is what a risk that's going to be for people who are driving, who can then have images in front of their face. Because, you know, I mean, what is it... is it, like, on Terminator or something, where we saw an idea of this, where things pop up and tell them? Is that what I'm thinking of, Terminator?
Brian (34:07):
I just remember... I think any robot movie. I think they can do that. Yeah, it's like, anything that's, like, futuristic.
Shawna (34:13):
Yeah, you probably won't see it on Jurassic Park. Oh well, I'm just joking. Uh, I was just thinking about how Ashton had mentioned, like, you know, you would be able to, like, put a TV show or something up in the corner for you, like if you're working out, you know, you would be able to watch that. But, like, no one's gonna be able to stop you from doing that when
(34:34):
you're driving. And you know, it's not like police officers are going to be able to tell what kind of glasses you have on when they're driving by. You know what I'm saying?
Brian (34:43):
Yeah.
Shawna (34:44):
And I'm sure navigation is going to be part of the technology that's available on there, so it's just going to be tricky.
Brian (34:51):
Yeah, I think that's good. I mean, I'm wondering... that's a good thought. I'm curious if they'll put something in there, like if it can recognize that you're moving at a certain speed. I don't know, it's very interesting. But then again, if you're on an airplane, you want to be able to use them.
Shawna (35:04):
So I'm sure, like, they got all these things to be thinking about. That's good. Nobody can drive at that velocity.
Brian (35:16):
What if we're getting a flying car soon? I mean, maybe. It might be a problem that was supposed to happen a long time ago.
Shawna (35:23):
What is the...
Brian (35:24):
Word to the Jetsons, yeah. Well, you know, I think that the Orion glasses are very interesting. In the blog I actually stated this might be the closest thing we've seen to time travel. It is pretty cool to see the holograms and stuff. So we are getting more futuristic. But these are really good thoughts to be thinking about.
(35:47):
Like, I wonder how they'll play with those things. That'd be interesting.
Shawna (35:50):
I did just think of one more thing, which is maybe the answer to all of this is self-driving cars, where it doesn't need our attention. So maybe Zuck and Elon can make a partnership, right?
Brian (36:04):
I don't know if we'll ever see them partner together.
Shawna (36:10):
I wouldn't put it past them. You never know what people will do.
Brian (36:12):
Well, hey everybody, thanks for tuning in to this week's Mindful Bytes podcast. We would love to hear your thoughts, so make sure you click the link in the show notes. Text us; we'd love to hear your thoughts. Any of your questions, we'll be talking about those, guys, in our live metaverse version of Mindful Bytes on the fourth Thursday of every month, so make sure you send those in so we can talk about them.
(36:33):
We'd love to talk about them. And next week, Ashton will be back from his little getaway and he'll be diving in deep with us about the Meta Ray-Ban glasses, a little bit more about the video-side technology and all that stuff that's happening, and I'm sure everybody else will have some cool stuff to talk about too. So make sure you join us then. If you like the podcast, don't
(36:55):
forget to click review and leave us some good feedback. We would love to hear what you think about the podcast.