Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.
Speaker 2 (00:12):
Hey you, welcome to Stuff to Blow Your Mind. My
name is Robert.
Speaker 3 (00:15):
Lamb and I am Joe McCormick, and we're back with
part two in our series called The Devil You Know,
which is about ambiguity aversion. That's the observation that we
prefer risks with known probabilities over risks with unknown probabilities,
even if you have no reason for thinking that the
(00:35):
unknown risks will be worse. This preference is captured by
the folk saying, better the devil you know than the
devil you don't. For the most part, people don't like
taking bets when they don't know what the odds are,
and will pay up just to avoid having to deal
with ambiguity. And this is funny in a way because
(00:55):
almost all of the decisions that we make in our
actual lives are made with incomplete knowledge and some significant
amount of uncertainty about our likelihood of success. Very few
things in reality have clear objective odds, like a fair
coin flip or a dice roll. However, it's pretty well
established that most people, most of the time, do not
(01:17):
like making decisions under ambiguous conditions, and we will go
to great lengths to avoid taking risks with unknown probabilities.
So in the last episode, after we introduced the concept,
we talked a good bit about the original piece of
writing that made ambiguity aversion famous. This was a nineteen
sixty one paper by the American economist, anti war activist,
(01:40):
and whistleblower Daniel Ellsberg. The paper was called Risk, Ambiguity,
and the Savage Axioms, published in the Quarterly Journal of Economics,
and to briefly recap, this paper proposed the concept of
ambiguity aversion using a number of thought experiments where you
could take different bets based on like which color ball
(02:01):
you would draw by chance out of an urn. Ellsberg's
point was that people would tend to prefer bets with
clear odds of winning, say a one in three chance,
over bets with unknown odds, where you could have anywhere
between a zero percent chance and a two out of
three chance. Ellsberg's intuition here has been broadly supported by
(02:22):
real world experiments. There are a few exceptions, but most
of the time most people would rather take bets with
clear odds, even in ways that end up implying self
contradictory assumptions about the unknown odds. This was an important
discovery in the economic field known as decision theory because
it violated a framework known as Savage's axioms, which were
(02:45):
widely used to model how people make decisions in situations
of uncertainty or ambiguity, and this self contradictory behavior is
now known as the Ellsberg paradox. And then, finally, in
the last episode, after talking about the theoretical origins of
ambiguity aversion, we also talked about this phenomenon in practice
(03:06):
in the real world, and our main example here was
the negotiation strategy famously associated with the Nixon White House
known as madman theory, a strategy where you intentionally cultivate
a reputation for volatility and unpredictability. In essence, you make
your peers and your counterparties worry that you are a
(03:28):
madman who is capable of anything. Now, we ended up
talking about a lot of reasons for thinking that this
is actually not a good strategy, but the idea here
is that you will exploit their natural ambiguity aversion to
get them to make concessions that they would not make
otherwise. And then we did get into
some international relations research on Madman theory, like reasons why
(03:51):
it might not actually be a good strategy, especially for
achieving long term goals. In fact, I think in the
long term most experts seem to agree that it is
harmful to everyone involved, including the practitioner. But again, the
reason it may sometimes work for achieving short term wins
is by taking advantage of this fear people have of ambiguity,
the fact that a lot of people would rather just
(04:13):
take a bad deal than keep negotiating with somebody who
is highly unpredictable.
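(A quick sketch for anyone who wants to see the recap in code. This is an editorial illustration, not something from the episode: it checks the classic three-color urn, thirty red balls plus sixty balls in an unknown mix of black and yellow, and shows that the typical preference pattern, betting on red over black but on black-or-yellow over red-or-yellow, cannot be squared with any single guess about the number of black balls.)

```python
# Ellsberg three-color urn: 30 red balls plus 60 balls that are an
# unknown mix of black and yellow (b black, 60 - b yellow), 90 total.
# Each bet pays $100 if the drawn ball matches the bet, $0 otherwise.

def expected_value(favorable_balls, payout=100, total=90):
    """Expected value of a bet that wins on `favorable_balls` of `total` balls."""
    return payout * favorable_balls / total

consistent_counts = []
for b in range(0, 61):  # every possible number of black balls
    prefers_red_to_black = expected_value(30) > expected_value(b)
    prefers_black_yellow_to_red_yellow = (
        expected_value(b + (60 - b)) > expected_value(30 + (60 - b))
    )
    # The commonly observed pattern: take both "clearer" bets.
    if prefers_red_to_black and prefers_black_yellow_to_red_yellow:
        consistent_counts.append(b)

# No single belief about b supports both preferences at once:
print(consistent_counts)  # -> []
```

The first preference requires fewer than thirty black balls; the second requires more than thirty. That empty list is the "self contradictory assumptions" part of the paradox.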
Speaker 2 (04:19):
You know, I was thinking after our recording that another
often cited bit of supposed wisdom along these lines is
that prison advice. Generally you'll see this referenced out of
context, so someone will not be talking about actually going
to prison. They'll be talking about your first day at a new school
(04:40):
or something, and they'll jokingly say, well, you know what
you're supposed to do. You're supposed to go up to
the biggest person around and hit them with a steel chair,
and then everyone will be like, whoa, that guy's crazy.
We better not mess with him. It seems to follow
similar logic, and also seems equally, if not more, flawed
in its approach. It does not sound like a tactic
(05:00):
that would actually generate like, you know, long term benefits.
Speaker 3 (05:04):
It's one of those pieces of advice that is probably
more often repeated because it's memorable and funny and
makes for a good story than because that many people
would actually think it's good advice, especially if you have relevant experience.
Speaker 2 (05:17):
I don't know.
Speaker 3 (05:18):
I mean, I don't know a lot about, like, prison strategy,
but it does seem like that could have some real downsides.
Speaker 2 (05:24):
You know, it makes me think of another bit of
wisdom, another nugget that's often dished out, and that is
getting into the philosophy of Nietzsche. You know, that which
doesn't kill us makes us stronger. But I forget who
it was who did a spin on this and said,
well, that which doesn't kill us nearly kills us, and that
(05:45):
should be a reason to give us pause.
Speaker 3 (05:48):
That's funny, because that Nietzsche saying, and I know this
has come up on the show before, is one where
in a huge number of cases it's just not true. Yeah,
like sometimes it makes us stronger, like we do sometimes learn,
learn and grow and gain strength from adversity, but that's
like some subset of adversities. A lot of adversities leave
us much weaker and deeply shaken.
Speaker 2 (06:08):
Yeah, shaken, or less trusting, or just completely weakened. And,
you know, something may weaken us. We survive it, but
it's weakened our system and then we're more susceptible to
the next thing and so forth. So yeah,
I mean, it's great to take solace in some of
these sayings, but you can only take them so far, really, So.
Speaker 3 (06:24):
Anyway, we're back today to talk about some more angles
on ambiguity aversion, and one of the first things I
want to get into is that in the last episode
we did talk a bit about the distinction that Daniel
Ellsberg tried to make between ambiguity and risk. But I
think one thing that's really important to point out that
we didn't get into last time is the distinction between
(06:47):
ambiguity aversion and risk aversion. I think risk aversion is
a concept more people are probably familiar with, you know,
one that comes up in conversation sometimes. These are not the
same thing. One who is risk averse doesn't like to
make bets. They would prefer a sure thing, even if
(07:07):
the sure thing is statistically less valuable than the risk. So,
for example, imagine a fair coin flip game. We stipulate
that this is a fair coin. There's no cheating involved,
and on each round when I flip the coin, I
can either just give you forty dollars no matter how
the coin lands, or I could give you one hundred
(07:28):
dollars every time it lands heads and nothing every time
it lands tails.
So with the gambling condition here, you will on average win fifty dollars
per round. It's a fifty to fifty chance of winning,
and the payout is one hundred dollars. So the odds
here are clear. If you play this game a thousand
times in a row, you will make more money by
gambling than by taking the guaranteed forty dollars every time.
It's statistically all but a sure thing. But still some
(07:56):
people would rather just take the actual sure thing. They
would rather just take the forty dollars, and that is
risk aversion. It is a psychological tendency to dislike and
avoid taking a stake in uncertain outcomes, even if you
know the exact odds of those outcomes and potentially even
if those odds will statistically bring you more benefit than
(08:18):
the sure thing. And another way of thinking about risk
aversion is that sometimes to some people, the feeling of
security that you get from taking the sure thing is
actually more valuable than the money. It's more valuable than
the extra money that you would take home on average
by placing a bet with a higher expected value. And
(08:39):
this emphasizes something that is true in decision theory and economics.
Value isn't just about getting or keeping money. We use
money because it's an easy way to quantify value in experiments.
But sometimes we would pay money just to feel certainty
and comfort. That feeling is of higher value to us
(08:59):
than the money is. But then, of course again, other
people at other times would not pay that. Some people
love the feeling of risk. I mean people gamble for pleasure.
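(For anyone who wants the arithmetic spelled out, here's a small sketch of my own, not from the episode: the guaranteed option pays forty dollars per round, while the gamble's expected value is fifty dollars per round, so over many rounds the gamble comes out ahead on average.)

```python
# Fair coin game from the example: each round you either take a
# guaranteed $40, or gamble for $100 on heads and $0 on tails.

P_HEADS = 0.5
SURE_THING = 40

# Expected value of one gambling round: 0.5 * 100 + 0.5 * 0 = 50
ev_gamble = P_HEADS * 100 + (1 - P_HEADS) * 0

print(ev_gamble)          # -> 50.0
print(SURE_THING * 1000)  # guaranteed total over 1000 rounds -> 40000
print(ev_gamble * 1000)   # expected gambling total over 1000 rounds -> 50000.0
```

Risk aversion is preferring the 40,000 sure thing to the 50,000 expectation; the ten dollars per round is the price paid for certainty.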
Speaker 2 (09:08):
Yeah, absolutely, but yeah, but other times it's like I
just don't want to take a chance on this. I'd
rather just go for what's going to be straightforward and guaranteed, right.
Speaker 3 (09:17):
So that's risk and risk aversion. Ambiguity aversion is
different from risk aversion because it is not about a
dislike of risk, but specifically a desire to avoid ambiguous
risks with unclear odds. You might love gambling on a
fifty to fifty coin toss, but you might not want
to play the game where you have to guess how
many black balls or yellow balls there are and there's
(09:39):
no way to know. So a person could technically be
not very risk averse at all, but highly averse to
ambiguity. The more I thought about it, the more I
thought that the interaction between these two tendencies in real
life is interesting, because I think for many of us,
it's not always clear if a fear or a
(10:02):
hesitancy we're experiencing about making a decision is rooted in
risk aversion or ambiguity aversion. For instance, this came up
because I was trying to think of examples of risk aversion,
and I thought of the idea of what to do
with savings. Imagine you save up a little bit of
extra money at work from your paycheck, and you want
(10:23):
to put that toward retirement, and you have multiple options
of what to do with that money. Imagine a person's
in that scenario and they discover that they would really
feel better just keeping the money in a bank account,
earning a relatively low rate of interest, rather than investing
it in like a stock market index fund, where with
the index fund, it is commonly assumed, it is standard
(10:46):
financial wisdom, that the balance will grow at a faster
rate than it will just sitting in a savings account,
and you will end up with more money years down
the road if you put it in the index fund.
This pattern of behavior could be taken as an example
of risk aversion, because it's like, I would choose to
probably end up with less money just so I can
(11:06):
avoid taking risks with what I have. But then I thought, well, no,
not necessarily. What if that same impulse is actually a
better example of ambiguity aversion than risk aversion? Because
how do we actually know that a stock market index
fund will grow more over time than the balance of
(11:27):
a savings account. The answer is historical patterns. It's like
statistically true over the last one hundred years or so,
you know, within certain economies in living memory. But it's
not actually like a fixed odds casino game or a
lab experiment where the odds of winning are fair axiomatically.
In the real world, as the fine print always says,
(11:48):
investing involves risk, including loss of principal, and past performance
does not guarantee future returns. So maybe actually preferring the
savings account is a form of ambiguity aversion rather than
risk aversion, responding to the idea that, well, what are
the odds that suddenly, somehow investing in the stock market
(12:10):
becomes a bad idea in a way that it has
not within living memory. It's hard to understand what the
odds of something like that are. Seems pretty unlikely, but
it's fundamentally ambiguous. You just can't say what the objective
odds are. There's really no way to calculate them. So
sometimes there's actually ambiguity about whether there is ambiguity, you know,
(12:33):
because, like, I think an investment advisor might tell you,
on one hand, well, you know, the historical performance of
the stock market, you can pretty much depend on that.
That's a known risk when you have statistics that tell you that you'll most
likely get this rate of return on average, and you
just don't know how seriously to take the possibility that
(12:53):
the future will not be like the past.
Speaker 2 (12:55):
Yeah. Absolutely, And another angle that I was looking at
here on all this is that it's not just the
ambiguity and risk, but it's perceptions of ambiguity and risk.
So you know, it ultimately isn't necessarily coming down as
much to like what are actually the hard probabilities here.
It can skew greatly depending on what a particular individual's
worldview is. So you could have somebody like you could
(13:18):
have, like, you know, a Washington Bartholomew Hogwallop figure,
and he's decided not only is he not going to
invest in the stock market, he's not even gonna put
his money in the bank. It's better off in the
tin can buried in his backyard.
Speaker 3 (13:31):
Yeah, there you go.
Speaker 2 (13:32):
Yeah, even though there may be very good statistics that
show that is less safe. Yeah, and that is a
greater risk. Your worldview and perception could be, oh no, absolutely,
you can't trust the banks at all. I can only
trust myself, and therefore it's safer in my yard with
me standing over it with a shotgun.
Speaker 3 (14:01):
But anyway, so this all brings us back to this
distinction between risk aversion and ambiguity aversion, and how difficult
it can be to tell the difference between the two,
Like when we feel aversion to making a bet of
some kind, which, by way of analogy, doesn't have
to mean literally a bet with money, but just any
important decision in our lives. When we feel that, like, fear, hesitancy,
(14:25):
or resistance, we often think of this as risk aversion,
but could it actually be more often ambiguity aversion? That
we're responding subconsciously to the fact that we don't know
how to calculate the odds for the decision we're about
to make, and that makes us uncomfortable. Now, one of
the big things we left open after we introduced the
(14:46):
concept of ambiguity aversion in the last episode is
how is decision theory supposed to make sense of this
finding of, like, the Ellsberg paradox? And there has been
a great amount of ink spilled in answering this question
in the fifty-something years since Ellsberg. I don't think
I can even attempt to cover the whole landscape of
responses here, so I want to be clear this will
(15:07):
not be a comprehensive technical survey of the ambiguity
aversion literature. I'm just going to mention a couple of
broad trends and then a few ideas within each. So
within the responses there are two main categories of thought.
One assumes that ambiguity aversion is actually a component of
(15:27):
rational decision making, that it is, in a subjective sense,
wise to avoid ambiguity even when you can't really prove
that the ambiguous bets will generally turn out bad. The
other main way of thinking is that ambiguity aversion
is some type of error or cognitive bias. And here
(15:48):
I'm going to mention a couple of sources. The first
one I cited in our last episode. That is a
book chapter by Mark J. Machina and Marciano Siniscalchi called
Ambiguity and Ambiguity Aversion, in the Handbook of the Economics
of Risk and Uncertainty, published in twenty fourteen. And then
the second source I want to mention here is by
Nabil I. Al-Najjar and Jonathan Weinstein, published in the
(16:13):
journal Economics and Philosophy, called The Ambiguity Aversion
Literature: A Critical Assessment. This one is more in that
second category I mentioned. It's an approach arguing that ambiguity
aversion is actually just a type of error, basically.
So first I want to look at what are some
of the ways that ambiguity aversion and, like, the Ellsberg
paradox might actually be a form of rational, self consistent
(16:36):
decision making. One theory here can be called the multiple
priors model, also known as maximin expected utility. Maximin
is m-a-x-i-m-i-n, so maximin expected utility, or MEU.
Let's think about it like this. Imagine you are asked
to make a bet involving ambiguous odds, like we'll go
(16:58):
back to the balls in the urn example from the
first episode. So am I more likely to pull a
red ball or a black ball out of this urn?
And so to remember the details there? The three color
eurn experiment was that inside this urn you can't see inside,
you don't know what color you're reaching into grab, but
there are exactly thirty red balls and then sixty balls
(17:20):
that could be any mix of yellow and black. So
they could be zero yellow and sixty black, sixty yellow
and zero black, thirty and thirty, or anything in between.
You have no way of knowing the mix in advance.
In these situations, maximin expected utility theory would say that
instead of forming one consistent belief about the number of
(17:41):
black balls inside and then placing bets according to that prediction,
we allow ourselves to consider multiple different probabilities and we
apply those probabilities to different bets pessimistically. So essentially, for
each new bet that requires us to guess the number
of black balls, we ask ourselves what would be the
(18:02):
worst case scenario for me here, and then we assume
those worst case odds when considering the bet. So if
you're asked to choose between black and red, there could
be anywhere between zero and sixty black balls, you just
assume there are zero and you bet on red. I
tried to come up with a real life decision sequence
that might help the contradictory behavior under MEU make sense.
(18:27):
So here's what I've got. Imagine this: you're making
plans for the afternoon and you don't know what the
weather is going to be. It might rain, it might not.
Don't know how to calculate that. You were thinking about
maybe going out for a walk, but then again, considering
that it might rain, you don't want to be caught
out in a storm, so you decide instead to stay
home and do some indoor activities. At the same time,
(18:50):
you are also deciding what to do about the flowers
on your front porch. They're drooping and they need some water.
Now it might rain and then they would get all
the water they need from that, But then again it
might not rain, so you decide to make time to
fill up a watering can and water them manually. This
could appear inconsistent because you've made two different decisions, both
(19:13):
on the bet that it will rain in one case
and on the bet that it won't rain in the
other case. It's possible that that's just self contradictory, or
it could be perfectly self consistent and logical if you
are considering a range of different probabilities of rain for
the afternoon and then applying the most pessimistic one to
(19:34):
each decision rather than just assuming one probability that applies
to all decisions. Does that make sense?
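(Here's a rough sketch of my own, with a simple payoff as an assumption, one dollar if your color comes up: score each urn bet under every possible mix of black and yellow, then use the worst case. This is the maximin rule described above, and it reproduces the Ellsberg preference pattern as a consistent pessimistic strategy.)

```python
# Maximin expected utility (multiple priors) for the three-color urn:
# 30 red balls plus 60 balls that are some mix of black (b) and
# yellow (60 - b). A bet pays 1 if the drawn ball's color is in
# the bet's event, 0 otherwise, so expected utility = win probability.

def meu(event_size_for_b):
    """Worst-case winning probability over all priors b = 0..60.

    `event_size_for_b` maps a possible black-ball count b to the
    number of balls (out of 90) that would win the bet.
    """
    return min(event_size_for_b(b) / 90 for b in range(61))

bets = {
    "red":             meu(lambda b: 30),             # fixed at 1/3
    "black":           meu(lambda b: b),              # worst case: b = 0
    "black or yellow": meu(lambda b: 60),             # fixed at 2/3
    "red or yellow":   meu(lambda b: 30 + (60 - b)),  # worst case: b = 60
}

# MEU reproduces the Ellsberg pattern as a consistent worst-case rule:
print(bets["red"], bets["black"])                      # 1/3 beats 0
print(bets["black or yellow"], bets["red or yellow"])  # 2/3 beats 1/3
```

Note how each bet gets its own pessimistic prior, b = 0 for one comparison and b = 60 for the other, which is exactly the "different predictions for each bet" idea in the rain example.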
Speaker 2 (19:40):
Yeah, yeah, I think so.
Speaker 3 (19:42):
And to be clear, there's no guarantee here that the
pessimistic assumptions will be correct, because again, the information environment
is truly ambiguous. You just can't know. That's what ambiguity is.
But considering multiple probabilities and acting according to the most
pessimistic one for each decision is theoretically a logical and
consistent strategy. So the authors that support this model argue
(20:06):
that this is actually a rational way ambiguity aversion might
produce the apparently self contradictory results we observe. Like, the
apparent contradiction comes from the fact that we're sort of
coming up with different predictions for each bet instead of
one prediction for the actual full state of affairs.
Speaker 2 (20:24):
Yeah. Though, of course, making all your decisions based on
the worst case scenario is not a great way to live your life.
Speaker 3 (20:29):
Right, and I'm going to come back to that
in a minute, because that is part of the critical assessment.
Another theory that would make the Ellsberg paradox behavior rational
is what's known as Choquet expected utility, or CEU.
And Choquet there, that's a French name. It's spelled
c-h-o-q-u-e-
Speaker 2 (20:46):
T. Choquet sounds delicious.
Speaker 3 (20:49):
Oh yeah. So Choquet expected utility is the idea
that when we evaluate a bet and the odds are ambiguous,
we basically do come up with a single internal
belief about probability, but it takes a weird form. Instead
of a probability, we use what the literature calls a capacity.
(21:10):
So most of the time when we're trying to judge
how likely something is to happen, the likelihood of
that event and the likelihood of its negation, of not
that event, together have to sum up to one hundred percent, right?
So, like, under normal probability, if there is a one
third chance a ball will be red, there is a
two thirds chance the ball will not be red. These
(21:33):
complementary probabilities should always sum to one, or, if expressed
as percentages, add up to one hundred percent. CEU theory
says that our internal representation of likelihoods for ambiguous events
is not like this. Instead, when the odds are ambiguous,
(21:53):
we operate on the basis of a capacity, which mathematically
incorporates our lack of confidence in the situation by representing
the chance of an outcome and its negation together in
a way that does not have to add up to
one hundred percent. And I apologize to the more informed people,
(22:13):
I'm skipping over some mathematical complexity here, but this is
the simplified version. If you take the example of
the three color urn, a person might begin with the
objective information that, you know, thirty of the balls,
one third, are red, so you know you actually have a
one third chance that the ball you draw is red. And
then for the yellow and black balls, instead of coming
(22:33):
up with guesses that sum up to two thirds,
which would be the actual sort of objective probability whatever
the mix is, you pick some smaller probabilities. Like, you
add in maybe a one fifth chance that the
ball is yellow and a one fifth chance that the
ball is black. Obviously, these cannot be the real
objective odds, again with the caveat on what are actually
(22:56):
objective odds. They can't represent reality, because they
don't add up to one and
there are no other colors in there. But again, the
values we come up with in these kinds of decision
making processes don't have to be perfect at predicting real outcomes,
and in fact they can't be, because we are missing
information; that is the ambiguity. The point is that this
(23:17):
is another self consistent way our minds could work while
producing the behaviors observed in these experiments. Like, the
percentage of likely outcomes that is missing from the
calculation reflects the level of ambiguity we feel about the bet.
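(To make the capacity idea concrete, here's a small sketch of my own, not from any particular paper, using the one-fifth values from the example. For a bet that pays one on an event and zero otherwise, the Choquet value of the bet is just the capacity assigned to the event, and complementary capacities deliberately fail to add up to one.)

```python
# A Choquet-style "capacity" for the three-color urn: a non-additive
# set function over the possible colors. For a bet paying 1 on an
# event and 0 otherwise, the Choquet expected utility is simply the
# capacity of that event.

capacity = {
    frozenset():                           0.0,
    frozenset({"red"}):                    1 / 3,         # known, objective part
    frozenset({"yellow"}):                 1 / 5,         # shrunk by ambiguity
    frozenset({"black"}):                  1 / 5,         # shrunk by ambiguity
    frozenset({"black", "yellow"}):        2 / 3,         # 60 of 90 balls: unambiguous
    frozenset({"red", "yellow"}):          1 / 3 + 1 / 5,
    frozenset({"red", "black"}):           1 / 3 + 1 / 5,
    frozenset({"red", "black", "yellow"}): 1.0,
}

# Complementary events no longer sum to one; the shortfall is the ambiguity:
print(capacity[frozenset({"black"})] + capacity[frozenset({"red", "yellow"})])

# And this single capacity reproduces the Ellsberg preference pattern:
print(capacity[frozenset({"red"})] > capacity[frozenset({"black"})])                      # True
print(capacity[frozenset({"black", "yellow"})] > capacity[frozenset({"red", "yellow"})])  # True
```

Unlike the maximin story, one internal belief covers every bet here; the non-additivity does all the work.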
So that's two main models, and there are other models
for making sense of the Elsberg paradox behavior in these
(23:38):
sorts of ways. But I think these give you a
general idea of how this strain of thinking works. The
other main way of thinking about ambiguity aversion is that
it is not actually rational, like you don't need to
find a way of making rational sense of it. Instead,
it is a form of error or cognitive bias. And
this is the thrust of the paper that I mentioned
(24:00):
earlier, from two thousand and nine, by Al-Najjar and
Weinstein. The authors here argue, in short, that ambiguity
aversion is actually a misapplied heuristic, meaning it's
a mental shortcut that might be truly useful in some scenarios,
for example, in situations where we need to be cautious
(24:20):
to avoid being scammed or manipulated, but this shortcut gets
mistakenly applied to scenarios where it is not helpful, and
misapplying the ambiguity aversion heuristic like this, they say, could
cause you to like make different bets about the same
thing when conditions are the same and no new information
(24:40):
has been learned, so, you know, like, that's producing this
contradictory or apparently irrational behavior. Or it could cause you
to treat non-informative information as informative. They also
argue that relying on heuristics like pessimism, coming back to
your comment earlier, Rob, again, that's always assuming the worst
case scenario for ambiguous bets. If you actually do this,
(25:05):
if you rely on a pessimistic heuristic in real life,
this will systematically produce incorrect predictions and bad outcomes. It's
actually not a good strategy to follow. Failure to make
bets or decisions in the presence of ambiguity also causes
us to never gain useful information because a lot of
(25:25):
times we can only learn more about what the real
odds are by making decisions in ambiguous conditions. You know,
like past bets on ambiguity lead to more informed bets
with higher certainty in the future because the previous bets
and their outcomes are information gathering tools.
Speaker 2 (25:44):
Yeah, and of course we see this played out in
the simplified world of games. If you want to get
good at, you know, a particular card game or chess or
anything like that, you need to be prepared to be
beaten a lot, you know. I mean, that's part
of learning what your odds are with a given strategy in play.
Speaker 3 (26:02):
That's right. So this strain of thinking says, no, ambiguity
aversion is not a rational strategy for making decisions. Instead,
it's more like an emotional bias that causes us
to act on, yes, self contradictory beliefs and to make
bad decisions because we do not like the feeling of ambiguity.
And honestly, I'm not sure which strain of thinking is
(26:25):
more on track. I've been reading about these and I
don't know who's more convincing, that it's part of
rationality overall, or that it's not. The
whole thing about ambiguity, of course, is that it's ambiguous
whether or not you should bet on it, so there
really is no objective answer. But I think, I mean,
what we're instead looking for is
(26:50):
what is the most consistent way to explain the subjective behavior,
rather than, like, who's actually right about, you know, the
ambiguous outcomes, because there's no way to know them in
advance by definition. I do think the critical assessment makes
a really good point about learning. You really do often
have to be bold and make decisions in the face
(27:10):
of ambiguity so that there will be more information and
less ambiguity in future decisions. That seems like a highly
salient point to me.
Speaker 2 (27:19):
Yeah, yeah, And would you say based on what you
were looking at that it sounds like ambiguity aversion is
itself adaptive in the grand scheme of things. Maybe it's
one of these situations like type one errors in cognition,
false positives, versus type two errors in cognition, false negatives,
(27:40):
where in sort of the hypothetical tiger hiding in the
bushes scenario, this is absolutely adaptive. But then when taken
into other aspects of life, especially modern life and all
its complexity, it's not necessarily useful.
Speaker 3 (27:54):
Well, I mean that, I think you could think of
it that way, and that might be one way of
approaching the error theory, right, that it's a mental shortcut
that is useful in some scenarios, again, especially when you're
like trying to avoid being tricked or trying to avoid
being taken advantage of by somebody who has superior information
to you. But then you're misapplying it to scenarios where
(28:17):
that's not really the case. You're just like you're using
a defensive mental shortcut in a case where you don't
really need to, and it's causing you to act with
unnecessary levels of defense, which produce irrational behavior. And we've
talked about other ways that that can be true. So
I think that fits more with the error way of thinking.
I think the other way of thinking would just say, yeah,
(28:38):
I mean, it doesn't really comment on this, but it
would probably just assume that it's
adaptive, because it is just part of our consistent decision
making toolkit, and it's you know, just the same way
that our brains decide that it's better to take a
bet with a ninety percent chance of winning than a
ten percent chance of winning. It also has this other function,
(29:01):
which is this ambiguity aversion function.
Speaker 2 (29:04):
Yeah, and like any toolkit, you want to use multiple
tools depending on the actual situation. If you have an
expansive toolkit and it's the screwdriver every time, even when
you need to hammer a nail, yeah, that's probably not
going to work out too well. But when you've got
to screw a screw in, yeah, you know you're golden,
assuming it's the right head at the end and
(29:26):
not, you know, the star one.
Speaker 3 (29:28):
Do you go through the same one-two process every
time you find a sharp screw out on the road
or on the sidewalk? Where, like, I hit the same
two thoughts every time. First of all,
it's like, ooh, pick it up, glad somebody, a child,
didn't step on this or a car didn't hit this with
their tire. And then the second thing is what did
this come out of? Was it holding something important together?
Speaker 2 (29:48):
Oh, I never do the second step. I always
just stop at the first step, where I'm like, oh,
glad nobody stepped on that or got that in their tire.
I have saved the day. And then I carry on.
Speaker 3 (29:57):
If it's like near my house, I look at
my house, and I'm like, is something coming apart? I don't know.
Speaker 2 (30:05):
If it's in the house, definitely, and I guess if
it's in close proximity to the house, But if it's
in the road, I'm just assuming, you know, I don't know.
I just never think about it falling out of a
vehicle in, like, a dangerous capacity. But maybe
I should. Maybe that's an additional level of worry that
I should begin to employ.
Speaker 3 (30:23):
I didn't mean a vehicle specifically. I mean it might
be or a house or anything. There are all kinds
of things that really need to remain stuck together.
Speaker 2 (30:30):
Yeah, I guess I just assume it's, like, work vehicles.
They have surplus screws rolling around and they're just rolling
out onto the road.
Speaker 3 (30:37):
Let's hope that's the case most of the time, but
I'm glad I could share that anxiety.
Speaker 2 (30:51):
All right, a few other angles we want to touch
on here. This first one, I'm not going to go
in super deep on, because this one is
another highly economic topic, but there's what's known
as the competence effect. So we've already touched on the
basics of this a little bit, but this is the
term for the phenomenon by which we tend to be
more ambiguity averse in fields where we feel we
(31:13):
lack experience or expertise. In economics and finance, this manifests
in investors who are more willing to invest in familiar
domestic stocks rather than complex foreign markets. And if you're
like me, that sentence is like, yeah, well, I'm uncomfortable
with either. I don't know my way around either, I don't know
one from the other, so just mark me down
as ambiguity averse to both.
Speaker 3 (31:36):
But it seems to be making the point.
Speaker 2 (31:38):
Yeah, but this is apparently a thing. I believe this
was first looked at in a paper by Chip Heath
and Amos Tversky titled Preference and Competence in Choice under
Uncertainty, published in the Journal of Risk and Uncertainty in
nineteen ninety one, and this found that people tended to stick
to gut instincts in a familiar area, even against
(31:58):
a rational, diversified approach that gets into areas they're not
that familiar with.
Speaker 3 (32:03):
This rings true for me. I know I am more
ambiguity averse in domains of life that I don't understand
very well, which is funny because we have to make
decisions with ambiguous odds of success in both unfamiliar and
in familiar domains. I mean, ambiguity does not leave reality
(32:24):
just because you know what you're doing in a certain
knowledge domain or hobby area or something like that. Having experience.
You can have all the experience in the world and
still there are things where it's just like you don't
know what your odds are. So I think maybe the
explanation for this difference is that in the unfamiliar domains,
I'm more likely to worry that ambiguity is artificial and
(32:49):
that I am being tricked or manipulated by someone who
has superior knowledge of that domain, whereas in familiar situations,
I understand what amount of information it is normal to have
and thus how much ambiguity is just natural and unavoidable,
and thus the ambiguity that is there becomes easier to embrace.
(33:12):
Does that make sense?
Speaker 2 (33:13):
Yeah? Yeah, And of course all this ends up coming
back to the topic of not only like what do
you know, but what do you not know? And what
do you think that you know that you actually don't know.
We've talked about this before on the show, and it
always reminds me of a line. I had to look
this up. I'd forgotten the source, but it was from
the movie Body Heat. You have a character played by
(33:33):
Mickey Rourke. He says, and I'm going to paraphrase
the quote a little bit, anytime you try to
commit a crime, there are about
fifty ways that you could mess up, and if you
can think of twenty five of them, you're a genius. Now.
I don't know if those numbers are exact, but I
like the spirit of the thing, you know, Like he's saying,
You're going into a situation and you have this great
(33:55):
degree of overconfidence. You don't realize all of the ways
you could potentially fail, and nobody can have perfect knowledge,
like you know, this is a quote that does recognize
that there will always be a certain amount of ambiguity.
But if you have expertise, that sort of cloud of
ambiguity is going to be smaller. It's not going to
go away completely, but it will be smaller.
Speaker 3 (34:17):
Well, I think yeah, and I think this.
Speaker 2 (34:19):
Yeah.
Speaker 3 (34:19):
The spirit of the quote is that like people often
go into a crime thinking they're betting on red, that
they're taking a known risk, when in fact they're actually
they're going into an ambiguous risk where they don't understand
how much risk there is.
Speaker 2 (34:33):
So anyway, minor cinematic diversion there. Now, another higher stakes
area for the individual concerning ambiguity and ambiguity aversion is
how ambiguity often factors into the healthcare experience. I was
reading about this. So there are two broad categories within healthcare.
There's diagnostic ambiguity, where there's
(34:57):
ambiguity surrounding the underlying disease and how it works. And
then there's therapeutic ambiguity, like how do we treat it
or even cure it, and I imagine a lot of
you out there have encountered examples of this. And these
different ambiguities can lead to rather opposite choices. But due
to ambiguity aversion, a patient might choose a treatment with
known but terrible side effects over one with the chance of, say,
(35:21):
a better outcome, but also unknown risks.
Speaker 3 (35:24):
Or because of ambiguity aversion, I think very often in
healthcare, people will gravitate towards someone who offers them false certainty
versus someone who is honestly communicating like unknowns about the
level of risk.
Speaker 2 (35:39):
Yeah, communication is key, and this becomes important in a
number of different ways. You know, this has an impact,
of course, on public health and highlights the importance of science
communication in healthcare, as well as the dangers of medical
misinformation and conspiracy thinking. For instance, in the context of vaccination,
ambiguity aversion can definitely take the form of
(36:02):
vaccine hesitancy. And I want to stress again here that
the ambiguity need only be perceived ambiguity. So if one
has already bought into such statements as, well, no one
really knows what these vaccines are, or no one
really knows how they work, just assuming that those statements
are true or that there's some large degree of truth to them,
(36:23):
then there may well be enough subjective ambiguity in
place in the individual's mind to make them rather averse
to this perceived ambiguity, and more likely to choose options
that bring with them greater risk but more of a
communicated idea of certainty, like dubious alternative preventative and treatment methods,
(36:45):
or just the idea of doing nothing at all.
Speaker 3 (36:47):
The body knows how to take care of itself, that
kind of thinking, Yeah, which sounds very certain when delivered
by a confident and charismatic speaker.
Speaker 2 (36:55):
Yeah. I was looking at a paper in Behavioral Medicine
from earlier this year, Psychological Correlates of Ambiguity Aversion in
the Context of COVID Nineteen Vaccination, and this was by
Simonovic et al., and they found four different major takeaways
here that I thought were interesting. They found that Americans
(37:16):
who perceived higher ambiguity about COVID nineteen vaccines reported lower
worry and lower perceived severity of COVID nineteen, which were
each associated with lower vaccination intentions and lower information seeking
about COVID nineteen vaccines.
Speaker 3 (37:31):
Okay, So if I'm understanding that, right, Americans who had
lower confidence in the efficacy and safety of COVID vaccines
were also for whatever reason, less worried about covid infection.
Speaker 2 (37:45):
Right. But at the same time, and this is kind
of a no brainer, right, it would seem
to follow, they would be less inclined
to have any intention of being vaccinated, and then would
be less inclined to seek out any information about said vaccinations.
Their mind at this point is already kind of made up.
Speaker 3 (38:02):
Well, I think that's right, But it's also interesting to
note the counterintuitive forms that having your mind
made up in this way can take, because sometimes the
way that I think that is expressed is like sometimes
it's just a statement of like, oh, yeah, nobody knows,
nobody can know about the vaccines, and so it just
becomes like this infinitely and unsolvably dangerous thing out there
(38:25):
that nobody can ever really have confidence in. Your
mind is made up about it, but it's
made up in a way that always just sort of
holds it as an unresolved danger.
Speaker 2 (38:35):
Yeah, yeah, yeah, And I'm sure that the individuals on
that side of the argument might well make the counter
argument that like, okay, well, people who trust in the vaccine,
they're just they trust blindly. They've just made up their
mind and they're not going to listen to any of
the criticisms. And I mean, I disagree with that. But
on the other hand, you know, I feel like, speaking personally,
(38:57):
like, I'm not going to pretend
to be doctor Robert all the time,
doing all my
own research on, you know, health and dental concerns.
Like ultimately, I have to trust in a professional who
knows what they're doing and is certified in what they're
doing and leave it to them to tell me what
(39:18):
I should do, because I don't want to have all
these decisions on me. I don't want to do my
own research on everything. I've got enough research to do
in and out during the week.
Speaker 3 (39:27):
Well, I mean about like medical matters. When people say
do your own research ninety nine point whatever percent of
the time, somebody who says that either is not doing
any research at all, and that just means go with
your gut feeling, or if they are doing research, it
just means doing very bad research. It means relying on
sources that you have no objective reason for thinking are
(39:49):
giving you good information, and probably are just confirming your priors.
Speaker 2 (39:53):
Yeah, like I choose to research my health topic by
picking up Chariots of the Gods.
Speaker 3 (39:59):
Yeah, I mean I would argue, of course, like doctors
and healthcare professionals can be wrong. That's possible, and it
happens all the time. But I think on average, you're
going to do a lot better just like listening to
doctors than like reading a Facebook post and then making
up your own mind about medicine.
Speaker 2 (40:15):
Yeah. So this particular paper, I'm going to breeze through
the other main bullet points. Basically, they came down to
people who perceived higher ambiguity about COVID nineteen vaccines. They
reported higher anger about COVID nineteen vaccines, which was associated
with lower perceived severity of COVID nineteen. They reported lower
happiness about the vaccines, which was associated with both lower
(40:36):
worry and lower perceived severity of COVID nineteen. And
they also found that both Americans and Israelis who were
looked at in the study who perceived higher ambiguity about
COVID nineteen vaccines, reported lower feelings of relaxation from the
COVID nineteen vaccine, which was associated with lower perceived severity
of the illness itself.
Speaker 3 (40:53):
So you have these constellations of different tendencies acting together.
You've got like a lower confidence of lower trust in
a healthcare treatment that also seems to go along with
having a lower concern or lower level of seriousness about
the condition that that treatment is supposed to treat, less
(41:15):
emotional positivity about the idea of taking the treatment, and
all of this together. So, I mean, well, one
thing this sort of suggests to me is that, of
course this is obvious, but it just reminds us that
when we are making decisions, we do, to some extent
try to make rational decisions based on our beliefs, but
(41:37):
we're also just like highly emotional creatures, and our decision
making is deeply entangled with feelings we have about the
things we're making decisions about. You know, it's not really
possible for us to approach every decision in life as
just like purely like odds and benefit maximizing machines. Like
we have feelings about things and those feelings, whether we
(41:58):
want them to or not do, influence the way we
make decisions.
Speaker 2 (42:01):
Yeah, and they will override reason, and all of us
are susceptible to this
sort of thing. But yeah, another way that I
think it's interesting and dreadful to think about all of
this is that, again, there's generally some degree
of ambiguity with both medical diagnosis and medical treatment. And
(42:22):
I can speak from experience, and again,
most of you can relate to this as well.
It can be frustrating when you go to a doctor
and instead of a clear course of action that you
might want or a definite solution that you are seeking,
you instead get a certain amount of ambiguity about what
might be wrong and how it might be addressed. Because
at the end of the day, I think we all
want the Star Trek medical tricorder experience where someone's able
(42:46):
to just scan us, perfect scan, see what ails us,
and then tell us exactly what they need to
do to fix it, probably with another device that they
just kind of run up and down our body. But
even with today's technology, it's more complex than that
in most cases. Meanwhile, where do we actually encounter this
lack of ambiguity in the messaging? Again, it comes generally
(43:07):
in different forms of alternative medicine, or in snake oils
or conspiracy thinking, because none of these voices ever says, look,
there's absolutely no evidence this works, but give it a shot,
or we can't know for sure if lizard people run
a secret government in Antarctica. But there's always plenty of
ambiguity in their evidence, of course, if supplied at all,
(43:29):
but you'll often find them maneuvering around that in their messaging,
which again is low on ambiguity and high on certainty. And
like you mentioned earlier, when it comes to legitimate medical messaging,
it tends to communicate both probability and risk. You'll get
that list of fast talk side effects at the end
of your advertisement for an actual certified medication, And again
(43:53):
this is just the inherent uncertainty in even very modern medicine.
Another paper I looked at was twenty twenty two's Vaccine
Hesitancy and Cognitive Biases by Casigliani et al., and this
(44:15):
points out that vaccination education strategies long followed a fact
based approach. But given our aversion to ambiguity, along with
biases and other heuristics, these various cognitive biases in messaging
have to be factored into the outreach, and of course
this gets into the complex way in which public health
and vaccine messaging have to attempt to counteract all the noise,
(44:38):
sometimes by leaning more on empathetic and audience based approaches
rather than fact based debate, because again, in many cases,
the actual medical facts have baked in ambiguity, while the
non scientific counter claims do not. You know, again,
this is absolutely frustrating and an increasingly dangerous
(44:58):
part of our reality. But yeah, the voice saying that
the vaccine has microchips in it that will turn you
into robots, like, you know, there's no ambiguity to that statement.
It's generally very clear,
very certain. Whereas again, actual medical facts and
actual medicine based arguments are going to have some level
(45:22):
of ambiguity in their messaging.
Speaker 3 (45:24):
This is one of the things that makes communication about
science generally, but especially about something very high stakes like
medical science and medical treatments, makes it
so difficult, just because you have to
find this very difficult balance: you are
compelled ethically to be forthright and honest about what you know,
(45:46):
but you also have to find a way
to present that truth in a way that will be
rhetorically effective and hit correctly. And so,
you are balancing things that somebody who is not
bound by an obligation to be ethical and honest is
not bound by. They can only focus on the rhetorical
part of the appeal. Like, all they have to worry
(46:08):
about is, is what I'm saying exciting and charismatic and convincing?
They don't have the other side of the scale to balance.
Speaker 2 (46:16):
That's right, that's right. And obviously this easily spills over
into other areas, you know, pseudo archaeology, pseudo paleontology, other pseudosciences,
alternative history, ufology, and more, because, as we I think
really strive to drive home and related episodes on this show,
(46:36):
actual history and archaeology inherently contain ambiguity. Like all histories are,
to some degree or another, imperfect. Some are just more
imperfect than others. You know, as much as
we know about certain periods of the past and certain
individuals and events, we still have gaps, we have disagreements
about interpretations, we have uncertainties. And I'm reminded
(46:58):
too, particularly, of Alan Moore's use of the Koch snowflake,
that's K-o-c-h, in nineteen eighty nine's From Hell, his
graphic novel about Jack the Ripper. So, the Koch snowflake
is a fractal curve that was first described by Niels
Fabian Helge von Koch in nineteen oh four. The idea
(47:20):
here is that it has a finite area but an
infinitely long perimeter, so lots of crenulations. And Moore's treatment
of this is that history has a shape with infinite
perimeter and finite area, a labyrinth of connections and repeated
patterns that we never fully comprehend. Ah.
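As an aside for the mathematically curious, the two properties described here can be checked with a quick calculation. This is only a sketch, not from the episode, and the function names are illustrative: each iteration replaces every edge with four edges, each one-third as long, so the perimeter grows by a factor of four-thirds per step, while the triangles added at each step shrink fast enough that the total area converges to eight-fifths of the starting triangle.

```python
from fractions import Fraction

def koch_perimeter(p0, n):
    # Each iteration replaces every edge with four edges,
    # each one-third as long, so the perimeter grows by 4/3.
    return p0 * Fraction(4, 3) ** n

def koch_area_limit(a0):
    # Step n adds 3 * 4**(n - 1) triangles of area a0 / 9**n;
    # summing that geometric series adds (3/5) * a0 in total.
    return a0 + Fraction(3, 5) * a0

# The perimeter diverges while the area stays finite:
print(koch_perimeter(Fraction(3), 10))  # grows without bound as n rises
print(koch_area_limit(Fraction(1)))     # 8/5 of the original triangle
```

So the curve really does enclose a finite area behind an unbounded boundary, which is the image Moore borrows for history.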
Speaker 3 (47:42):
That's interesting. So if I understand that right, it sort
of explains why as you read more about history and
learn more about history, you can, by incorporating more and
more knowledge and holding it, you know, parallel in your mind,
you can have in some ways a stronger and realer
picture of the past. But also the increasing informational
(48:04):
complexity makes everything increasingly blurry as well. Like, it
becomes harder to have
causal through lines and clear narratives about
why history happens, because the more you know, the more
other causal factors you can consider when thinking about, like
(48:25):
why something happened in history. And ultimately it gets to
the point where it's so complex that it's like who
knows why anything happened?
Speaker 2 (48:32):
Right, right? And that was his point about like who
was Jack the Ripper? You know, we end
up with just so many books, so
many theories, so much literature and writing and conspiracy theories
and so forth, and for the most part, it's all
within the confines of that snowflake. The actual answer is
outside the snowflake. You know, we can
never actually determine it with any degree of accuracy; it would just
(48:56):
be conjecture. At the same time, conspiracy
thinking itself engages in something akin to the Koch snowflake,
you know, because it's often an infinite search
for hidden details, whether those details are there or not.
But still the basic premise holds true. Actual history is complex.
While the narrative provided by conspiracy thinking is generally simple
(49:19):
and single track, leading to a singular force at the
center of the entire conspiracy. Like it will be. It
will be more akin to like why did this happen? Well,
it's this evil cabal or this you know evil you
know corporation leader and so forth, as opposed to well,
there are multiple societal factors and historical anomalies that play
(49:40):
into this particular reality. And again, more ambiguity in real life,
less ambiguity or no ambiguity at all in the conspiracy argument. Yeah,
it comes back to, it's like stocks. You know, I don't
know anything about stocks, but I know that when I
see that little advertisement at the bottom of a blog
and there's a picture of like an old person smiling,
(50:03):
and it's like, this is a surefire bet stock that
so and so has just identified, I'm instantly
suspicious because it seems a little bit too good to
be true. It seems like there's not much ambiguity to
this particular suggestion. It would be a risky click. All right,
one more area I want to touch on here, because I
was looking at a really recent paper on ambiguity aversion. Okay,
(50:26):
here's a hypothetical scenario. You're considering a couple of vacation options,
and we'll assume for the purposes of this experiment that
the cost and other matters are equal. Here, one is
a trip to a family vacation destination that you've visited
numerous times before, and the other is an entirely new experience.
Maybe it's visiting a country you've never traveled to, or
something of that nature. And then, as you're making up
(50:50):
your mind, what do you do? You flip on the
news, and surprise, surprise, it's all negative, and it's all about traveling.
Would this exposure to negative news influence your decision making
and push you toward the less ambiguous choice, or would
you still basically make your decision as if you would
have if you didn't watch the news at all.
Speaker 3 (51:12):
Well, I guess it could go multiple ways. But yeah,
I would tend to assume that increasing my anxiety would
make me more ambiguity averse, and that I would be
seeking more known odds and familiar experiences if my anxiety
levels are higher.
Speaker 2 (51:30):
Yeah, yeah, you know, prior to reading this paper, I
might have jumped at this thinking as well. I have
very rarely altered travel plans due to something I
saw in the news. But I'll be the first to
admit that news items can rattle my resolve a bit,
you know, they can make me more anxious. So I
would have guessed, yeah, negative news can surely push one
away from risk, away from ambiguity, and back into the familiar,
(51:52):
and more to the point, might be likely to do
so on the whole.
Speaker 3 (51:56):
Now, one of the only things I would think that
might push me back in the other direction is
a finding we talked about in the previous episode, where essentially,
if the known odds are bad, people's ambiguity aversion
about taking the other bet lessens, right. So, like in the
three color urn experiment, if instead of thirty red balls
(52:18):
there are like five red balls, people at that point
become more likely to bet on black because it's like, well,
the odds of the thing you know about are terrible,
so why not take a gamble on something else? So
I wonder maybe if that could push you back the
other way. Like my intuition is that anxiety makes you
(52:40):
more ambiguity averse and just want to go with something
where the odds are clear, But maybe anxiety makes you
believe that Like essentially the odds of stasis are really bad,
and thus you might as well take a gamble on
something different.
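To make the urn logic concrete, here is a small sketch. It assumes the standard Ellsberg setup of ninety balls, a known number red, and the rest an unknown mix of black and yellow, with a uniform prior over every possible split; the function name is ours for illustration, not from any cited paper.

```python
from fractions import Fraction

def expected_win_prob_black(n_red, n_total=90):
    # The non-red balls split between black and yellow in some
    # unknown way; average the chance of drawing black over every
    # possible black count (a uniform prior over the splits).
    n_unknown = n_total - n_red
    avg_black = Fraction(sum(range(n_unknown + 1)), n_unknown + 1)
    return avg_black / n_total

# Classic setup: thirty known red balls, sixty unknown.
print(Fraction(30, 90), expected_win_prob_black(30))  # both 1/3

# With only five red balls, the known bet is worse on average,
# so even a somewhat ambiguity-averse bettor may switch to black:
print(Fraction(5, 90) < expected_win_prob_black(5))   # True
```

An ambiguity-neutral bettor should be indifferent in the classic case, which is exactly why the observed preference for red is interesting; shrinking the red count makes the known odds bad enough that the ambiguous bet starts winning people over.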
Speaker 2 (52:55):
Yeah, yeah, you know, we also have
to acknowledge that news reporting can certainly change
the perceived odds of something happening. Like if you were
to watch a news network, a hypothetical news network that
was just twenty four hour coverage of snake bites, that's
all they covered. Anytime anywhere in the world someone's bitten
(53:17):
by a snake, they are there to cover it in full.
This would certainly give you a heightened awareness
of maybe the legitimate risks of snake bite, but also
maybe an inflated feeling about your personal risks of snake bite.
Speaker 3 (53:31):
Yeah, okay, so that's different in that what I was
just talking about was like just how general emotional affect
could affect the way you reason, but it could also
be content specific.
Speaker 2 (53:39):
You're right, yeah. But anyway, mostly what we're dealing
with here is the affect side. So a recent study from
Adelphi University looked into this. It was published in Frontiers
in Psychology and titled Negative News Exposure Does Not Affect
Risk or Ambiguity Aversion. So their answer is pretty
much in the headline there, as is often the case. Now,
(54:00):
this is another one of those small studies, I believe
they worked with groups here of like eighty
four and two hundred and twenty nine, so you know,
as always, more research is required. This is not something
you can just take to the bank, but it's an
interesting possibility. The authors point out that while studies have
found that negative affect can increase risk aversion and ambiguity
(54:21):
aversion, we didn't know, and I quote, whether these effects generalized
to more realistic negative stimuli such as watching the news.
Speaker 3 (54:30):
Oh okay, so maybe in previous studies it's been found
that negative emotions can increase ambiguity aversion and risk aversion,
and so you would just assume that stimuli like watching
the news would also increase those aversions. But maybe
it doesn't.
Speaker 2 (54:48):
Yeah, and you can see why this would be something
worth knowing. I mean, not only from a general standpoint
of understanding. You know, how people make choices based on
their mood and what influences their mood. That's certainly valid
and that's the main point of the research here, but
also you can imagine it being a very interesting
topic for individuals buying advertisement time on, say, a twenty four
(55:10):
hour news network, especially if, say, they want
to have an advertisement for some sort of adventure travel
or something of that nature. You know, Bungee cords, I
don't know, whatever you happen to be selling. So, in
this particular study, one group was exposed to negative news,
and in this case it was like news about a
car crash, and the other group was exposed to neutral news,
(55:32):
so not great news, not good news, not like a
real fluff piece, but just something about train schedules, something
nice and boring, and, man, maybe even a little interesting.
And they found that although participants who watched negative news
reported a significant increase in negative affect, they did not
differ from the neutral news group in their risk or
ambiguity preferences.
Speaker 3 (55:53):
Interesting, Okay, so they said past studies had found negative
affect increases ambiguity aversion. They showed people bad news
and it did increase negative affect, but did not increase
ambiguity aversion.
Speaker 2 (56:08):
Right, right, So I don't think it means go ahead
and consume all the negative news you want. It's not
going to impact you. Like, no, it can and will
impact you. But what the study seems to be suggesting
rather is that like sort of incidental or realistic consumption
of negative news, that exposure is not going
(56:29):
to be enough of a force in and of itself
to move the scale and actually impact your decision making process. Again,
on the whole, this would be a generalization,
and again, it's based on, you know, a smaller study, and
there are so many factors on top of this that you
could throw into the scenario. But both groups were equally
likely to choose a certain gamble over an ambiguous one
(56:51):
following the consumption of their media.
Speaker 3 (56:53):
Oh well, something just occurred to me. I have no
reason to think this is really true, because I'm sure
the authors would have done a pretty good job of
planning this out. But like, what if the control group's
train schedule news was actually pretty demoralizing? It's supposed
to be a neutral control, but actually that made people
equally ambiguity averse.
Speaker 2 (57:14):
Ha, that would have been an interesting outcome. No, no, no,
But my understanding is that basically the normal parameters of
ambiguity aversion were in place, so people were still averse
to ambiguity, but in equal measure, whether they had just
learned about a car crash or about the exciting world
of train schedules.
Speaker 3 (57:33):
Yeah, so they basically just conformed roughly to Elsberg's predictions.
Speaker 2 (57:38):
Yes, exactly. But again, so many studies come out about this.
There are new ones all the time. It's just such a
fascinating way to sort of get in and crunch how
we make decisions in life based on how we deal
with ambiguity and the inherent ambiguity of life.
Speaker 3 (57:58):
Yeah. Again, I can't emphasize this enough. Because
these studies want to quantify the effect, they need to
use these fixed games that include certain
odds and risk conditions where the odds are
very clear. But in real life, you're dealing with basically
top to bottom ambiguity, real ambiguity all the time. Really,
(58:21):
rarely do we encounter decisions where our probability of success
is absolutely clear. Some decisions have
clearer odds than others, but you never really know what
your odds are.
Speaker 2 (58:35):
Yeah, I mean, we've talked about this before on
the show. It's one of the
reasons we like things like zombie apocalypse scenarios, not because
we necessarily like the idea of the world ending, or
of the dead coming back to life and trying to
eat our brains. But baked into those scenarios, there's often
like a simplification of how the world works, a removal
(58:55):
of some degree of the ambiguity. And there's often still
in any good thing, I think you're going to have
a contemplation of ambiguity as well. But in broad strokes,
you might have a scenario boiled down to like, Okay, well,
now it's the living versus the dead. Now it's the
clear good guys versus the bad guys. And what is
the answer? Well, it's always blasting it with a shotgun.
Speaker 3 (59:16):
Right, Isn't it interesting how in most stories, any major
change that happens to the characters is because of whatever
the struggle in the story is about. It's not like
totally random, out of nowhere exogenous events come in and
completely change the story. Occasionally that happens, and people, I
think, often find that quite interesting in storytelling because
(59:38):
it's pretty.
Speaker 2 (59:39):
Rare. Where the cause of the scenario is ambiguous?
Speaker 3 (59:43):
You mean where, like, a major change maybe comes
the middle of the story and it has nothing to
do with what the main struggle or plot of the
story is just comes out of nowhere. Yeah, I can't
think of an example off the top of my head,
but I know there are some like this. Write in
with your examples.
Speaker 2 (59:58):
All right, we're gonna go ahead and close up this episode.
But we'd love to hear from everyone out there. We
know that everyone has experience with ambiguity in life, judging it,
being averse to it, rolling the dice anyway, and so forth.
So write in; we would love to hear from you.
Just a reminder that Stuff to Blow Your Mind is
primarily a science and culture podcast, with core episodes on
Tuesdays and Thursdays, short form episodes on Wednesdays and on Fridays.
(01:00:18):
We set aside most serious concerns to just talk about
a weird film on Weird House Cinema.
Speaker 3 (01:00:23):
Huge thanks as always to our excellent audio producer JJ Posway.
If you would like to get in touch with us
with feedback on this episode or any other, to suggest
a topic for the future, or just to say hello,
you can email us at contact at stuff to Blow
your Mind dot com.
Speaker 1 (01:00:44):
Stuff to Blow Your Mind is a production of iHeartRadio. For
more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.