Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_00 (00:00):
Hi, I'm David Eisenberg, Assistant Professor of Information Management and Business Analytics at the Feliciano School of Business, and I'm proudly joined by Jorge Fresneda, who holds the Herbal Chair in the Department of Marketing at NJIT, the New Jersey Institute of Technology, at the Martin Tuchman School of Management.
(00:20):
Hi Jorge, thanks for coming today.
SPEAKER_01 (00:22):
Thank you very much
for inviting me.
Thanks.
SPEAKER_00 (00:25):
Today I want us to talk a little bit about business analytics, the Neural Analytics for Business Lab at NJIT, how it got started, and what we're doing with it now. What are some of the things you're most excited about?
SPEAKER_01 (00:41):
So the reality about neuroanalytics, about marketing analytics in terms of neural sensors, is that neuromarketing 10-12 years ago required crazy expensive devices, devices that only a
(01:01):
few universities could afford: so big, so noisy. And there is a growing miniaturization, making these devices smaller and smaller and making them more available and more accessible to everybody, for marketing research, for business research.
(01:24):
And there is a little bit of a revolution in this area, with these devices becoming more accessible and more portable, which allows researchers to capture and collect information in more realistic, more comfortable settings.
(01:44):
You don't have to lie down inside a crazy, noisy, claustrophobic machine to collect data anymore. Obviously, this comes at a price. I mean, the accuracy of these small, portable devices
(02:09):
is not the accuracy of those monstrosities of machines, but it is reasonably good, reasonably high, and we can use it for many different applications, such as finance research and marketing research, many different applications where we can actually identify emotions. We can identify and quantify specific emotions and see how
(02:35):
these emotions lead to certain patterns and behaviors, and I think that is a very, very exciting field, and very accessible.
Data science, also, which is behind the interpretation of all these signals and the identification of
(02:56):
these emotions, these feelings, is also contributing a lot to the field, helping us identify and quantify these signals and interpret them into emotions. So we are, again, a little bit within a revolution now of these devices, and the real-life
(03:20):
applications of these devices for business are endless.
SPEAKER_00 (03:26):
So one of the things that popped into my head when you were talking is the idea of the miniaturization of all of it. These devices can potentially sense and understand people's feelings, and they are becoming smaller and
(03:50):
also less expensive. How do you think that affects consumers, and the potential for businesses to create entrepreneurial new inventions or devices, so that these things become amenable or available to
(04:13):
the public at large?
SPEAKER_01 (04:15):
So for consumers there is a very exciting challenge, in the sense of: can a device know you better? Can a device tell you what you
(04:36):
really, really like in your inner self, what you really, really want, better than what you even consciously think? So I think there is a very exciting application of this in, again, finance and marketing, which are the fields we are most interested in right now: actually
(04:58):
developing and providing products and services that are really what customers, what individuals, really, really want. And in many cases they don't even know that they want it. So I think that would be one of the, I don't know if it's the best application, but I think it's the one
(05:22):
that excites me the most: being able to truly, truly customize offerings, in terms of products and services, for consumers. I think that would be my number one.
Sure.
SPEAKER_00 (05:36):
And so tell us a little bit about the lab that we've set up at NJIT that focuses on neuroanalytics.
SPEAKER_01 (05:47):
So we are basically working with two different devices. One is called EEG; basically it's an electroencephalogram, and it captures brain activity, brain waves, the actual electrical activity in your brain. An electroencephalogram is basically what is used in medicine to see that there is mental activity in a
(06:11):
brain. And we combine it with another set of devices called GSR, galvanic skin response, which in plain English is kind of like a lie detector. So we combine those devices. We have a specific dedicated space at NJIT where we collect data from individuals who volunteer to
(06:31):
participate in our studies. Very briefly, very quickly, very easily: they receive some kind of stimuli, and we use these two devices, EEG and GSR, to collect the body's
(06:54):
physical and mental response to those stimuli. We collect all these data, all these signals, and we use machine learning algorithms to clean and translate those signals and identify the emotions that follow
(07:16):
the stimuli the volunteers receive. So we basically have a set-up space with computers, with these EEG and GSR devices, and we have the space to present the stimuli to our volunteers, collect all this
(07:38):
information, translate it, and do research on it. So this is basically the setting of our lab.
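[Editor's note: as an illustration of the kind of pipeline described here (stimuli, cleaned EEG/GSR signals, then a machine learning model that maps signal features to emotion labels), a minimal sketch in Python with scikit-learn might look like the following. Everything in it is an assumption for illustration: the feature names, labels, and data are synthetic stand-ins, not the lab's actual setup.]

```python
# Hypothetical sketch of an emotion-classification step, not the lab's
# actual code. Features and labels below are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for cleaned per-trial features: e.g., EEG band powers
# (alpha, beta, theta averaged over channels) plus a GSR peak count.
n_trials = 200
X = rng.normal(size=(n_trials, 4))
# Toy labels: 0 = neutral, 1 = aroused. In reality these would come
# from stimuli with known emotional content (e.g., rated video clips).
y = (X[:, 1] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=n_trials) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize features, then fit a classifier on the training trials.
clf = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

In practice the features would come from preprocessed recordings rather than random numbers, and the cleaning step (artifact removal, filtering) that Jorge mentions would happen before anything like this runs.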
SPEAKER_00 (07:47):
And yeah, so I actually helped set up the lab, and Jorge and I run the lab together. Actually, for those of you watching from Montclair State: there are Montclair State University students who have now participated in this lab, and we're always looking for additional interested students.
Absolutely.
SPEAKER_01 (08:08):
We need a lot of help. So please, please volunteer.
SPEAKER_00 (08:13):
So we're actually in the process of putting together a grant application where we might be able to recruit even more students for that purpose. Fingers crossed. So, in any case, one of the things that you
(08:34):
mentioned was machine learning, which is sometimes used as a synonym for, or considered a part of, the more overarching term artificial intelligence, or AI. How do you think the ability of software, or of these kinds of devices, when applied in a
(08:58):
consumer way, could feed into data that can then be interpreted to provide feedback or dynamic interaction from artificial intelligence?
SPEAKER_01 (09:12):
So artificial intelligence, in the end, is going to be able to provide real-time recommendations for consumers. We are in the infancy of that, but these devices, these sensors, are becoming more and
(09:32):
more available, more and more portable. Some of them are even already included in some smart devices. So I think that through artificial intelligence, in the long run, I don't know how many years, as a consumer I'm going to be able to get these recommendations in real time.
(09:53):
Recommendations, again, for something I may not even be aware that I need, that I want, that I'm striving to get. I think artificial intelligence is already, and is going to be, a key element in providing recommendations and suggestions in real time for consumers.
SPEAKER_00 (10:15):
And something that we haven't directly started addressing yet in our lab, but which I think is really interesting and something we could eventually be investigating, is within the area of AI and robotics:
(10:35):
there is a whole field of robotics and AI becoming emotionally aware. So, the notion of robots or AI being able to understand and respond to our emotions: how does that potentially change the way we interact with software that functions as artificial intelligence?
SPEAKER_01 (10:58):
I mean, in many, many, many ways. Think about a very simple example: customer service. If consumers, instead of interacting with some kind of phone-tree response system,
(11:19):
are interacting with something that can actually respond to my feelings, my emotions, that is a game changer, at least in customer service, in service marketing. And I can think of many other applications; finance as well. I think that would give a human touch to all this robotics.
(11:40):
Again, I'm a marketing professor, so I'm always thinking from a consumer perspective, and from a consumer perspective I think it is a game changer, at least in service marketing.
SPEAKER_00 (11:51):
Certainly a lot more engaging, or responsive, than when I call and it says press one for this and press two for that.
SPEAKER_01 (12:00):
Robotic call voice.
SPEAKER_00 (12:02):
Um, indeed. And so how do you think the ability of AI and software to become aware of people's emotional responses, and therefore on some level become empathic to people's needs and
(12:26):
feelings... and now we're crossing over from marketing a little bit into MIS, or information systems, and maybe to some extent HR... how do you think that affects the future of work, the ability of AI to step into the role
(12:46):
of people in the workplace, or to work with people?
SPEAKER_01 (12:51):
So you can think of many ways; there are many potential ways of applying that. The first thing that comes to my mind in this sense is that different people are more effective in different circumstances.
(13:12):
So it would be very, very interesting to be able to assess and identify under what emotional circumstances people are most effective. There are people who work very well under pressure, which is probably my case. I need deadlines, I need to have something coming up;
(13:32):
I need pressure to be effective. Some other people don't; they kind of dissolve under pressure. So I think that being able to recognize, identify, and quantify these emotions, and somehow
(13:53):
measure how effective people are under certain emotional states, can also be a game changer in terms of making us better performers, better workers and employees, by replicating, or
(14:14):
trying to find, the emotional state that helps you be a better employee, a better manager, and so on. So I think this can also have a very important application for the future of work and can eventually help us be better employees and performers in our jobs.
SPEAKER_00 (14:35):
And the space I'm thinking of, which I would say is sort of service work, may even be the crossover between consumer- or marketing-oriented work and day-to-day work. I'm thinking of when I'm at the
(14:55):
supermarket, or at Walmart, and there's this tall robot, like six feet tall, with a creepy smile on its face, and it comes up behind me and beeps really loudly, and I freak out. So how would it be different if something like that were optimized, or evolved, in such a way
(15:20):
that it is emotionally aware, so it's not just coming up behind me and freaking me out without realizing that it's freaking people out?
SPEAKER_01 (15:27):
I mean, I think the key word here is trust. I think humans distrust what is not empathic, what is not emotionally aware, what is not reflecting or connecting to your emotions. I do think the key word here is trust: trusting
(15:49):
devices that are emotionally aware of your feelings, of your emotions.
SPEAKER_00 (15:54):
That's interesting. Because I think you can imagine, across a lot of the different devices that are currently trying to engage with people, that it becomes very easy for people to mistrust them. I know the idea of trust is something you were working on even before we got involved with neural sensing and all that.
SPEAKER_01 (16:15):
Yeah, even from the very beginning of e-commerce in the 1990s, trust was a key element. People tended to distrust the early days of Amazon and e-commerce because you didn't know what those companies were going to do with your credit card information and shipping: "I don't want to provide my shipping information."
(16:37):
So trust has always been a key element in everything that has to do with technology and our interactions with it. And I do think that something that is emotionally aware can build that trust and let us
(16:58):
trust these devices and these services more.
SPEAKER_00 (17:03):
That makes sense. And so what are some of the projects that we're currently working on in the neural analytics lab?
SPEAKER_01 (17:13):
So we are working basically in two different areas. We are working on finance and financial products, trying to come up with a system that can, first and foremost, understand the role of emotions in making financial decisions and buying financial products,
(17:38):
and eventually provide better recommendations. Once we understand the role of emotions and how people make emotionally loaded decisions in finance, we want to apply that to build better recommendation systems for people identifying or choosing financial
(18:03):
products. So that is the first major area of work. The second major area of work has to do with marketing and assessing a specific type of product called experience products.
(18:24):
Experience products, as the name somewhat says, are products that you really need to experience in order to assess and evaluate them. You have to be sitting there watching a movie to evaluate the movie. I can think, "I want to watch this Star Wars movie," but I actually have to watch it, to experience it, to assess how
(18:47):
good or bad it is. So in marketing, evaluating experience products, a video game for example, has always been a problem, because there are no clear features or characteristics that you can assess. You don't have a processor with these characteristics, you don't
(19:08):
have a laptop with this much memory; there are no specific features you can use to assess it. You actually have to experience the product. So we are embarking on a project to use our wearable devices to evaluate experience products and get a
(19:31):
better idea of how people evaluate and assess these experience products, and again use that to develop better products, better recommendations, and so on. So these are the two major areas we are currently working on. We got IRB approval for all of this. We have many other projects that we are developing, but
(19:53):
these are the ones we are currently, actively working on.
SPEAKER_00 (19:58):
And when we're talking about experience, I think one of its key features is the fact that experience is something that occurs over time. So it's not like I'm looking at a series of products on a table and saying, "that's the best one." It's more like I'm watching a Netflix movie or
(20:20):
listening to music or something like that.
That's easy to do, two hours.
Right. So how does it make a difference that a neural sensing device, something that maybe can be embedded, and is currently being embedded, in one of the upcoming
(20:41):
iterations of the Apple Watch, or in other devices... I myself have a pair of headphones, which I got through a grant, which you've seen, that allow me to listen to stereo music. They look just like regular stereo headphones, but they're made by the brand that we use for the EEG, electroencephalography, devices,
(21:05):
and can therefore be scanning my emotions and my brain while I'm listening to music. How is the ability for that to happen in real time, over the course of time, different from something that is just assessing an emotion for a moment?
SPEAKER_01 (21:22):
So that's a key element of what we are doing. A movie can be two hours long, and that is a very long period of time over which you need to collect data. You have to assess all that information and make
(21:45):
an educated assessment of how all these signals and all these emotions lead to liking or disliking a movie, to wanting to watch it again, to sharing on social media that "this movie was so amazing." That is something that, again, has been puzzling marketers for a long, long time. We think we have a way to measure that long
(22:09):
exposure to a movie, to an experience product; something that is not as easy to assess as a pen that you use, and that's it. So I think we can hopefully provide a way of measuring and evaluating these products with our portable
(22:32):
devices, in real time. We can actually collect data through a whole movie. And again, it's not a claustrophobic machine making a lot of noise or distracting you; you can be sitting in a cinema, a movie theater, with the device on and go through the whole experience in a more realistic setting.
(22:53):
I do think it can contribute to actually providing a good measurement, a good evaluation, of these experience products.
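[Editor's note: since the measurement described here is a continuous signal collected across the full length of a movie, one simple way to summarize it is as a windowed timeline. The sketch below is purely illustrative: it assumes a 1 Hz "arousal" score has already been decoded from the EEG/GSR data, which is the hard part and is not shown.]

```python
# Illustrative sketch, not the lab's actual code: summarizing a
# continuous emotion score collected across a full-length movie.
import numpy as np

def windowed_summary(signal, fs=1.0, window_s=300):
    """Mean emotion score per non-overlapping window (here, 5 minutes)."""
    samples = int(fs * window_s)
    n = len(signal) // samples
    trimmed = np.asarray(signal[: n * samples])
    return trimmed.reshape(n, samples).mean(axis=1)

# Synthetic two-hour recording, one sample per second: a flat baseline
# plus a bump around a hypothetical "climax" scene near the end.
t = np.arange(2 * 3600)
arousal = 0.3 + 0.4 * np.exp(-((t - 6300) ** 2) / (2 * 300.0 ** 2))

timeline = windowed_summary(arousal)
print(len(timeline))             # 24 five-minute windows
print(int(np.argmax(timeline)))  # index of the window containing the climax
```

A timeline like this is what would let a researcher say not just "viewers liked the movie" but *when* the reaction happened, which is exactly the over-time aspect being discussed.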
SPEAKER_00 (23:03):
In some ways, one metaphor that comes to mind is how, during the presidential elections, if you're watching certain network television shows, you'll see that certain people have been hired to sit at home with these dial devices, pushing "good" or "bad" over and over again
(23:23):
in order to send a signal to the show, and the show quantifies that to tell how much people are enjoying listening to one political candidate or the other. And it almost baffles the mind to imagine how that could be transitioned into something
(23:46):
where millions upon millions of iPhone or Apple Watch users could essentially be gauging reactions automatically, even more instantly, providing them in real time, and in many cases not
(24:06):
even being aware of the device collecting the data, which is another important point, because that gives you a more realistic measurement. And in terms of the finance aspect that you were talking about, what would be an example of someone making a financial decision, let's say, based on emotion, and how might
(24:32):
that affect their decision making?
SPEAKER_01 (24:39):
Risk, and risk assessment, is probably the first thing that comes to my mind. People tend not to have a very good idea of the level of risk that they are willing to take, or that they really think they can take. There are many, many instances of people
(25:04):
initially buying financial products thinking, "oh, I want to take whatever risk it takes, I want to make a lot of money." And I think that risk assessment around buying financial products is one of the areas where people tend to make a lot of
(25:24):
emotionally loaded decisions in finance. So one area, again, I think, is risk. And I do think we can contribute to really giving you an idea of the real level of risk that you are willing to assume or take. I think risk assessment is probably one of the areas, the
(25:47):
initial area, that we can help with, with our devices and our methods.
unknown (25:52):
Cool.
SPEAKER_00 (25:53):
All right. So I guess with that... I'm biased, of course, but I think this is really exciting research that we're doing. Thank you, thank you very much.
SPEAKER_01 (26:05):
It is, it is.
SPEAKER_00 (26:07):
And hopefully more people will become interested. If anyone listening is really interested in this kind of work, feel free to reach out to us, either at NJIT or at Montclair State. And thanks for being part of this.
SPEAKER_01 (26:25):
Thank you for watching, thank you.