
February 20, 2025 • 32 mins

Unlock the secrets of internet privacy as we journey through the intricate world of digital cookies and their profound impact on our online lives. Featuring insights from Garrett Johnson and Tesary Lin of Boston University Questrom School of Business, we explore the evolution of privacy regulations and their influence on major browsers like Chrome. Discover how first-party and third-party cookies have transformed digital marketing, and learn about Google's innovative Privacy Sandbox initiative, which seeks to balance user privacy with the needs of advertisers and publishers.

We also shine a light on the controversial "dark patterns" in user interface design that can manipulate cookie consent rates, challenging the effectiveness of privacy regulations like the GDPR. Delve into the complexities of data protection with a focus on the Global Privacy Control and the potential for a unified U.S. federal privacy law. As we navigate the emerging landscape, we address the responsibilities of AI platforms and social media in safeguarding user data and managing content. This episode promises to equip you with a deeper understanding of the ongoing challenges and future implications in the realm of internet privacy.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
J.P. Matychak (00:20):
Thank you, greetings everyone, and welcome to another episode of the Insights at Questrom podcast. I'm J.P. Matychak, and alongside me is my co-host, Shannon Light.

(00:40):
Shannon, how are you? I'm great, thanks. Okay. Well, today we're talking about cookies. Not those cookies. That's a different podcast altogether. No, we're talking about digital cookies and internet privacy, and if you've been anywhere on the internet lately, you're all too familiar with the banner across the bottom of pages

(01:01):
informing you that the site uses cookies and asks you to accept the use of them. And if you're anything like me, you accept the statement without fully appreciating what that even means. But cookies may not have much life left in them. Since 2020, Google has been working to eliminate the use of cookies across its Chrome web browser, which, according to

(01:24):
statcounter.com, dominates the browser market with over 65% of users worldwide. Since Google made their announcement, we have seen significant changes, and they've been testing their Privacy Sandbox with the goal of phasing out the use of all third-party

(01:44):
cookies by the second half of 2025. Here to talk to us about cookies and the broader issues around internet data security are Garrett Johnson, Associate Professor of Marketing, and Tesary Lin, Assistant Professor of Marketing, both from Boston University Questrom School of Business. Garrett, Tesary, thanks for joining us. So let's start by level-setting our listeners.

(02:09):
What are cookies exactly?

Garrett Johnson (02:13):
Well, you see, there's chocolate chip cookies, oatmeal... Nobody likes the oatmeal raisin ones, but the chocolate chip ones are my favorite.

J.P. Matychak (02:21):
I know, that's my favorite too.

Garrett Johnson (02:24):
It says a lot about me, I know. So, if we're talking about browser cookies: browser cookies essentially allow the web to have a memory. Without a browser cookie, if you went to a website, found a product you wanted to purchase, and added it to your cart, then when you went to the cart, the website wouldn't know that you'd added the product, and so it wouldn't be

(02:44):
a very satisfying experience. So a cookie is just a text file with an identifier in it that identifies an individual user. The example I talked about was a first-party cookie, meaning the website itself puts a text file on your computer to remember who you are. What's more controversial is what's called a third-party

(03:04):
cookie. With a third-party cookie, the third party refers to vendors that the website works with to do work like targeting advertising, and in that case the browser is interacting with a third-party domain owned by a company like Google. That creates a third-party cookie, and that identifier still serves that same idea of giving the internet a memory,

(03:27):
but now it allows these different vendors to connect your behavior across websites. Critically, what that allows them to do, and what transformed digital marketing, is connect eyeballs to wallets: they can see who sees an ad and then who subsequently purchases.
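Garrett's description can be made concrete with a small sketch using only Python's standard library (the cookie name `session_id` and the function names are illustrative, not any real site's code): a first-party site mints an opaque identifier, sends it back in a Set-Cookie header, and recognizes the user when the browser returns it.

```python
import uuid
from http.cookies import SimpleCookie

def mint_first_party_cookie() -> str:
    """On a first visit, create a text identifier and format it as a
    Set-Cookie header value. This is the web's "memory"."""
    cookie = SimpleCookie()
    cookie["session_id"] = uuid.uuid4().hex      # opaque per-user identifier
    cookie["session_id"]["path"] = "/"
    cookie["session_id"]["samesite"] = "Lax"     # keeps it first-party
    return cookie["session_id"].OutputString()

def read_cookie(cookie_header: str) -> str:
    """On a later request, parse the Cookie text the browser sent back
    and recognize the returning user (e.g. to restore their cart)."""
    cookie = SimpleCookie(cookie_header)
    return cookie["session_id"].value

header = mint_first_party_cookie()
print("Set-Cookie:", header)
print("Recognized user:", read_cookie(header))
```

A third-party cookie works the same way mechanically; the difference is simply that the domain setting it belongs to a vendor embedded on many sites, so the same identifier comes back from all of them.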

J.P. Matychak (03:43):
So it sounds like, for businesses, this was probably one of the bigger innovations in digital marketing and e-commerce: being able to track customers and pull them back in. So why are cookies going away?

(04:04):
I mean, you said some of them are controversial, with third parties and whatnot, so why are they going away? And why are companies that I imagine have profited, like Google, making the move to eliminate the use of them?

Garrett Johnson (04:18):
Yeah, I think there's a sea change in how people are thinking about privacy. Part of it's coming from large government regulations like the GDPR in Europe.

J.P. Matychak (04:27):
And the GDPR is?

Garrett Johnson (04:29):
Sorry, the General Data Protection Regulation in Europe, which is a kind of generational change in privacy regulation. It's quite an all-encompassing regulation. And public sentiment is shifting, caring more about privacy and maybe turning more negative towards the tech sector. And so these combined forces mean that the large browser

(04:50):
vendors have started to take it upon themselves to get rid of cookies. So actually, Chrome is the last major browser to still have third-party cookies: Safari already blocks them, and so does Firefox. And what's different about Chrome is that Chrome is saying, okay, if we're going to get rid of these things, well, cookies do actually have a lot of redeeming features. Let's try to create replacement technologies before we just get

(05:14):
rid of them.
Interesting.

J.P. Matychak (05:19):
So let's talk about the Privacy Sandbox. We mentioned at the top of the show that Google's been testing this Privacy Sandbox, and you also mentioned some other replacement technologies. So let's talk a little bit about it. What is this Privacy Sandbox? How does it work? And what are some of the other tools that maybe some of the other browsers are working with as replacements?

Garrett Johnson (05:40):
Yeah, so Privacy Sandbox is a collection of technologies whose goal is to preserve the benefits of cross-site identity that you get from third-party cookies while offering superior privacy protection to users. There are a few parts worth unpacking there. One is that it's a collection of technologies: right now, third-party cookies are a very simple technology, but they

(06:03):
allow many use cases, like the ability to target ads, to measure ads, to reduce fraud, and just to track user behavior and visits to websites, for instance. And the benefits are that people get ads that are more useful for them, and publishers get more revenue.

(06:23):
Our research suggests that publishers get about double the revenue when they have third-party cookies, so it's very valuable to publishers, and users get free content and more useful ads. Now, the privacy protection is interesting, because consumers will still be seeing similar ads to what they're seeing now,

(06:44):
targeted based on the website they're browsing. But it's going to be hard for any one company or government to put together user behavior across websites in the way that is potentially possible today.

Shannon Light (07:02):
I know I might be jumping ahead here, but I am very curious to know: what does digital marketing, what does digital advertising, really look like without these cookies?

Garrett Johnson (07:16):
Yeah, so I think there's four main things that kind of replace third-party cookies. One main thing is Privacy Sandbox. Another is falling back to contextual targeting: if an advertiser doesn't know anything about you, they still know that you're on, say, a finance-related website, and they can show you a

(07:36):
finance-related ad.
Certainly, the largest companies in this space will continue to be dominant: the Facebooks, the Googles, the Amazons of the world. They maintain a lot of data about you that they own themselves, and so that puts them at a relative advantage. And because these cross-site identifiers are so valuable to

(07:58):
advertising companies, basically, if you get rid of one technology, then historically something that looks very much like it takes its place. So one shape that takes is that, increasingly, you get websites asking you to log in when you visit, and what's going on is that they're using your email address, they're encrypting it, and they're using that encrypted email

(08:21):
address as your identifier in place of the third-party cookie. There are also more surreptitious ways of doing that, called fingerprinting, which try to use information about your browser, like your IP address, to identify who you are. So yeah, if you get rid of these technologies, life finds a way.
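To make the logged-in identifier idea concrete: what Garrett describes as "encrypting" the email address is, in common industry practice, normalizing it and one-way hashing it. A minimal sketch (hypothetical function name; SHA-256 chosen just for illustration) shows why two unrelated sites end up with the same token for the same person:

```python
import hashlib

def email_to_identifier(email: str) -> str:
    """Normalize an email address and hash it into an opaque token.
    Any party hashing the same address gets the same token, so logged-in
    visits on different sites can be matched without a cookie."""
    normalized = email.strip().lower()   # " JDoe@Example.com " -> "jdoe@example.com"
    return hashlib.sha256(normalized.encode()).hexdigest()

# A retailer and a publisher compute the token independently...
token_at_retailer = email_to_identifier("JDoe@Example.com")
token_at_publisher = email_to_identifier("jdoe@example.com")

# ...and an ad platform seeing both can link the two visits.
print(token_at_retailer == token_at_publisher)  # True
```

Note that, unlike a cookie, this only works where the user has actually logged in on both sides, which is the point Garrett makes next.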

Shannon Light (08:41):
It's very valuable to advertisers, and so it kind of comes back to life through all of that: putting in your email address or, like you said, even tracking IP addresses, which then gives marketers the ability to track people geographically and target them geographically.

Garrett Johnson (09:00):
Well, track them as an individual. So if you are visiting a Nike website and you buy Nike shoes, and then you go visit Yahoo, where you're a logged-in user, then Nike can find you at Yahoo and show you a Nike ad.

(09:22):
But in order for this to work, you go away from kind of a more permissionless system, where the cookies were being installed on people's browsers, at least at the beginning, without any interaction from them at all. Now you have to consciously log in, and you have to log in on both sides in order to make this connection. And so, effectively, what this does is reduce the free flow of information; it reduces the scale of this kind of

(09:45):
targeting and, you know, it creates more control over this. So it's less permissionless, but it also makes it harder to enter the space, especially if you're an advertiser or a publisher that doesn't have a lot of relationships with your customers through which you get email addresses.

J.P. Matychak (10:04):
So, Tesary, your research has been a lot about data privacy and data security. You know, an impetus for a lot of these changes has been, as Garrett said, the shifting sentiment about data privacy and people just becoming more aware of it. So how have data privacy issues, and we hear about breaches all

(10:28):
the time; I mean, it seems like I get a new email every single day about another company that had a breach and lost my information. How is this impacting the way the world is changing when it comes to cookies and other ways of tracking consumer information and dispersing that information?

Tesary Lin (10:48):
Yeah, that's a very good question. I think 10 years back it was quite plausible that consumers were, by and large, not aware of companies' tracking practices. But given the recent development of privacy regulations, and also all the changes from Apple and from Google, consumers are increasingly aware of these data

(11:11):
collection and usage practices. Now, that doesn't necessarily mean that consumers are not willing to share. In my own research, some of the time consumers are willing to share their data with some of the companies, but it is true that they are more

(11:31):
discriminating about which companies they are willing to share with. So, something we have done recently: we ran a large-scale field experiment with Andrey Fradkin here at Questrom and also Chiara Farronato at Harvard. Basically, what we did was look at the influx of consent banners triggered by the GDPR.

(11:53):
What does that do to consumers in terms of their willingness to share their data with websites?

J.P. Matychak (12:00):
And those consent banners, sorry, are those the little banners I talked about at the top, right? Where you get to the website and it says, hey, this site uses cookies, and are you giving us permission to use them? So when you say consent banners, that's what we're talking about? Great, okay.

Tesary Lin (12:17):
Exactly. So basically, we ran an experiment trying to see, when consumers interact with these websites and see a banner, what percentage of the time they are willing to share their data with the company. And the baseline sharing rate is around 60%. Now, 60% is actually higher than what we would expect,

(12:41):
because if you compare the result with what has been reported after Apple pushed their own consent prompts on iOS platforms, the sharing rate there is around 27%. So this is actually higher than what we expected, but it is not 100%, and I think most of the time what advertisers want to

(13:02):
see is 100%.
Now, this is also the baseline rate when the website or company doesn't do any optimization of how they elicit data from consumers. Depending on the specific user interface design, the consent rate can go up to, say, around 80%. Or, in some of the designs that we have tested, the probability

(13:25):
of rejecting all cookies could go down to almost 5 percent.

J.P. Matychak (13:29):
Wow. What are some of the changes that impact that? You said you tested different designs, so what makes someone more likely to say yes or opt in?

Tesary Lin (13:40):
Yeah, so we have tested different types of visual designs, and this is something that, well, if you look at the news, they would sometimes refer to as dark patterns. These are different visual elements that would nudge consumers into certain actions. Of the specific design patterns that we have tested, one of

(14:04):
them is hiding the specific action from the main user interface. For instance, if I'm a website, I don't want consumers to click "reject all cookies," so I hide that option. Among all the basic design elements, that is the one that is most

(14:25):
effective: it tends to increase the consent rate by around, I think, 10%. And then another design element that is effective is the default. If you set the default to sharing all the cookies, then that is effective at nudging consumers into sharing. Now, defaults are kind of tricky for a website to use because,

(14:48):
if you look at the recent implementations of the GDPR, a lot of the countries are actually banning the use of defaults. So something that we have seen is that companies are shifting to alternative design patterns in order to nudge consumers into sharing. Now, on the flip side, some other patterns that we have been

(15:10):
testing change only the visual elements: for instance, graying out certain options or re-ranking the options. For example, if I rank the "accept all" option at the top versus at the bottom, that doesn't actually change privacy choices that much.

(15:32):
So the takeaway is that if you only change the pure visual elements, that is not actually going to affect privacy choices by a lot, at least in 2024. There may be an element of consumers getting acclimated, getting used to different design elements. On the other hand, designs that actually make it harder,

(15:57):
more difficult, and more time-consuming to perform a certain action, those remain effective.

Shannon Light (16:04):
To your point about one of the more effective designs being actually hiding options in the consent banner: I'm just curious, how is that allowed?

Tesary Lin (16:20):
That is a good question. Well, I'm not a legal scholar. My best explanation of the reason it is currently, quote-unquote, allowed is that the regulators are still not catching up to the evolving banner design practices, and it's

(16:41):
more or less a cat-and-mouse game, in the sense that, well, the regulators are playing whack-a-mole, trying to catch the most manipulative patterns that work. But given the fact that most companies have all the infrastructure to test and optimize their consent banners, it is very plausible for them to

(17:04):
find the next banner designthat works.

Garrett Johnson (17:08):
I'd like to jump in on that a little bit. I think this is a symptom of the really troubled relationship that we have as a society with privacy: we want to live in the data economy, and we get a lot of benefits from that, but we also want to maximize our privacy. And so the GDPR in particular, in Europe, is pushing a view of privacy that makes it

(17:32):
so that people should be providing opt-in consent and, as you say, it shouldn't be possible to hide your banners. So why do these sorts of things persist, at least in some parts of the EU, like six years after the regulation was put into place? I think it's because it's actually extremely valuable to websites to be able to collect people's consent and thereby

(17:53):
monetize their ad impressions, to the point of, like, doubling their ad revenue. And I think what's challenging here, and Tesary makes a good point about the technical challenges, but I think a big challenge from a regulator's perspective, is that you have the mandate from the public to increase people's privacy. You don't have the mandate from the public to go and shake down
You don't have the mandate fromthe public to go and shake down

(18:14):
websites and like, reduce theirrevenue by a factor of like 50%,
and so this creates this sortof tension.
That means that some regulatorshave been more aggressive in
pushing designs that Tesserypoints out would increase or
decrease consent rates, butothers have just kind of backed
off, and I think this is areally challenging part of this.

J.P. Matychak (18:47):
Do either of you see a world in which this type of consent would be standardized, in the sense of how it's implemented? Or do you think that's just a bridge too far for many regulators, to mandate what technologies you're going to use? The reason I ask is that I found it interesting that on some websites I can't do a thing until I acknowledge that banner,

(19:09):
and on others I can scroll away and never have to do anything with that banner. And so it just goes to your point that people are still testing, and some have it hidden, as if, as long as you have it somewhere, you don't have to show a banner. So do you see any type of future where that's standardized, and what would be expected?

Tesary Lin (19:31):
I see some version of standardization floating around as a proposal. In particular, well, I don't actually know whether this would count as standardization, but one thing that the regulators have been discussing as a way to

(19:52):
reduce the impact of, well, non-standardized banners is what they call Global Privacy Control. Now, what is Global Privacy Control? It's basically saying: I want to give the users, the individual consumers, the option to turn on or off

(20:13):
cookie sharing at the browser level. That is the sense in which it's called global. Now, the current version of Global Privacy Control still says: well, if a user turns off the privacy choices at the browser level, but individual websites later ask for that

(20:34):
and get consent, they can still override it. So the devil is actually in the details, because, well, imagine the situation where a consumer turns off cookie sharing at the browser level: individual websites will have the incentive to continue nagging the consumer to share their data. Going back to Garrett's point, websites have the incentive to collect more data.

(20:54):
So it is a bit hard for me to see how that whole proposal, once it's implemented, is going to play out. But it is something that has been discussed a lot by the regulators.
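For the technically minded: the Global Privacy Control signal itself is small and concrete. Participating browsers send a `Sec-GPC: 1` request header (and expose `navigator.globalPrivacyControl` to page scripts). A sketch of the server-side logic Tesary describes, assuming an opt-out regime and the override behavior she mentions (the function name and policy details are illustrative, not any site's actual code):

```python
from typing import Optional

def allow_third_party_sharing(headers: dict, banner_answer: Optional[bool] = None) -> bool:
    """Decide whether to share a request's data with third parties.

    Assumes an opt-out regime: sharing is allowed by default, the
    browser-level Sec-GPC signal opts the user out globally, and an
    explicit consent-banner answer (if the user gave one) overrides
    the global signal, as in the proposal discussed above.
    """
    if banner_answer is not None:
        return banner_answer            # explicit site-level choice wins
    gpc_opt_out = headers.get("Sec-GPC") == "1"
    return not gpc_opt_out              # otherwise the global signal decides

# A GPC-enabled browser with no banner answer: sharing is off.
print(allow_third_party_sharing({"Sec-GPC": "1"}))        # False
# The same browser after the site nags the user into clicking "accept".
print(allow_third_party_sharing({"Sec-GPC": "1"}, True))  # True
```

The two print lines capture exactly the tension in the discussion: the check is trivial to implement, but whoever decides that the second call returns `True` has decided how much the global signal is actually worth.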

Garrett Johnson (21:09):
Yeah, I agree. I think it's a salient change to the internet that we've seen from these privacy regulations, and I think for many of us it's not a good experience to have these things pop up in our face all the time. So Tesary mentioned the Global Privacy Control; this is actually something that the California privacy law has explicitly pushed for, and the regulators are actually going

(21:29):
after companies that aren't respecting it, so we might see more of that. I think the challenge, though, is that standardizing these consent processes could probably be good for consumers. But the very nice finding from Tesary's research is that exactly how you set up that decision, and how you frame that decision, is going to massively

(21:51):
affect consent rates. And so I think the scary thing for stakeholders here is: who sets these defaults? Is it firms? Is it browsers? Is it government? Because that's going to be enormously consequential.

J.P. Matychak (22:08):
And there's even variation. Sorry, I know you were about to ask, but I want to follow up on something, especially since there are gaps, or a lack of reconciliation, in some of the regulation. As you mentioned the GDPR: I mean, those are the regulations in Europe, right, and we're not necessarily beholden to those

(22:29):
same regulations here in the US, unless you've got operations in the EU and other places overseas. I imagine that for many companies, especially global ones, it is difficult to reconcile regulations here and abroad.

(22:50):
Are you seeing a call for just, hey, let's just have one universal privacy regulation for internet privacy? Or do you think that's something people are just going to avoid? Best guess, or what you've been hearing?

Garrett Johnson (23:11):
I think that, from a firm's perspective, complying with a European privacy law, with what's getting up to like 15 different US state laws, and with different laws throughout the world, gets to be a substantial headache. And so there's a little bit of a kind of highest-common-denominator effect: just comply with the strictest possible law and then just set

(23:34):
it and forget it and be done with it. And that's maybe why you start to see these more GDPR-consistent consent banners on websites that really aren't interacting with EU users. So, to your question about what comes in the future: one potential thing you could see is a federal privacy law in

(23:55):
the US that would start to level-set things. But in the meantime, these companies are just basically going through the full decision tree: if a user is coming from this place, then this is what we're going to show them, and these are the standards we're going to apply, trying to make it as consistent as possible. And it's created a business for some companies.

Tesary Lin (24:13):
Yeah, I think, on the regulation side: both Garrett and I have been following privacy regulations for a long time. Something that we see is that, well, there is willingness to push for a comprehensive, standardized federal-level privacy law, but the proposals have received a lot of pushback because, for instance, places like California

(24:40):
want to stick to their version of the privacy law, which is more restrictive and more privacy-protective, and most of the privacy bills at the federal level are much more lenient than that. Therefore, there is this tension between individual states wanting to implement their own versions of privacy law versus the federal level,

(25:02):
which wants to make everything standardized. So, in practice, my personal conjecture is that if you want to see a version of the future where we have a standardized version of privacy regulation, that is going to be pretty hard, even at the national level. If you want to see something that is standardized across the

(25:23):
continent, that is going to be even harder.

Shannon Light (25:26):
Yeah, I mean, with the rise of OpenAI and ChatGPT, I always wonder about the amount of information people are putting into those types of platforms. What is the regulation around that? And I know that it's spitting back out the knowledge it has

(25:48):
from being built, but can you explain the regulations around a platform like artificial intelligence? Who's going to regulate that?

Tesary Lin (26:00):
So I'm not actually aware of regulations that specifically target generative AI and these large language models so far. There might be something that gets proposed later. But what I have been seeing is that, at least if you look at

(26:23):
the company users, they are very concerned about it, and they have been putting various restrictions in place preventing their employees from using these platforms. So I think this is a very interesting example, because here you actually see that, well, if you look at this specific product, where the users are actually companies, they are

(26:44):
putting very active measures in place to protect the company's privacy, which concretely means intellectual property rights or other types of business secrets.

J.P. Matychak (27:09):
It seems to me that, you know, I think I mentioned off air that we had a conversation around social media platforms and their responsibility for editorial content, and people being censored and whatnot, and it seemed like that was the next wave of regulation and policy that we were just going to have to deal with.

(27:30):
It seems like this one is the next wave: the data protection, the privacy. Again, as we said, it seems like every day there's just a new story. Look out to the future.

(28:01):
We, of course, know where we are now. Where do you see, maybe, the big red flags of what we're going to have to deal with next, either as regulatory bodies or as a society, when it comes to data protection over the next five to 10 years? What are the things that we aren't talking about yet? That, as you've done your research and looked at these things, make you say: boy, you know what this is.

(28:22):
If we don't resolve this, we're going to see some issues.

Garrett Johnson (28:28):
I think we still have this longstanding tension between, you know, again, wanting to live in the data economy and getting the value that's created by data. We're a business school; we teach analytics to our students; it's a huge part of how modern business works. But of course, we do want to improve privacy at the same time, and I think that

(28:48):
the regulations that have tried to do so, like the GDPR, have had some substantial downsides for the data-driven economy, and one thing in particular is harm to competition. And so I think we have to wrestle with, you know, how do we create a better privacy regulation that balances those two goals?

(29:10):
And I think one thing that's really interesting, which Privacy Sandbox speaks to, is that there are new technologies developed by computer scientists called privacy-enhancing technologies, and what they try to do is let you have your cake and eat it too: have privacy, but also allow data to be used in ways that create value. And I don't see maybe enough regulators thinking about that

(29:34):
very important issue. Because, you know, it's kind of a weird technology. We talked about AI: AI is going to diffuse itself, because everybody wants to use this great technology. Privacy-enhancing technologies, though, are kind of worse for firms. So if you want them to diffuse, then you need to create some

(29:58):
social incentive for these technologies to actually be taken up by firms. And one thing that I am a little bit sad to see is that, with so much focus on cookies, in some sense we're focusing on solving last year's problems, or the problems of the last two decades, and not thinking about, okay, well, there's the possibility of replacing them with a host of technologies like those provided by Google, or proposed by Apple,

(30:20):
or those proposed by Microsoft. You know, what are we as a society going to do to try to make that third way, with all of its trade-offs, something we're considering as well?
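One classic privacy-enhancing technique, randomized response, shows the "have your cake and eat it too" idea in a few lines: every individual's report is deniable noise, yet the aggregate statistic advertisers care about still survives. (A generic textbook sketch, not any specific Privacy Sandbox API.)

```python
import random

def randomized_report(truth: bool) -> bool:
    """Report honestly half the time; otherwise answer with a coin flip.
    No single report can be trusted, which protects the individual."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports: list[bool]) -> float:
    """Invert the noise in aggregate: observed = 0.5 * true + 0.25."""
    observed = sum(reports) / len(reports)
    return 2 * observed - 0.5

random.seed(0)
true_rate = 0.3   # e.g. the fraction of users who saw an ad and then purchased
reports = [randomized_report(random.random() < true_rate) for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))   # close to 0.3, yet no user's answer is exposed
```

This is also why Garrett notes that such technologies are "worse for firms": the firm recovers the aggregate rate, but gives up the individual-level eyeballs-to-wallets link that third-party cookies provided.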

J.P. Matychak (30:34):
Well, it'll certainly be interesting as we continue to navigate all of these things, where people want to have their cookies and eat them too. Right, there you go. I had to bring it all full circle.

(30:56):
So, Garrett, Tesary, thank you so much for joining us and helping us make sense of all this. We really appreciate you coming on the show with us today.

Tesary Lin (31:07):
Thank you for having us.

J.P. Matychak (31:08):
Great. Well, that'll wrap things up for this episode of the Insights at Questrom podcast. I'd like to thank our guests again: Garrett Johnson, Associate Professor of Marketing, and Tesary Lin, Assistant Professor of Marketing at the Questrom School of Business. Remember, for more information on this episode and previous

(31:33):
episodes, along with other insights from Questrom School of Business experts, visit us at insights.bu.edu. For Shannon Light, I'm J.P. Matychak. So long.