Risk and Loss Aversion

Under a risk-reward rubric, most people are averse to risk, even to the point where they will forgo a coin toss for $1 at even money, a round of liar's poker for the dinner check, or a drawing of lots to determine who comes out on top.  Most people's decisions clearly demonstrate that they prefer a certain positive outcome over a range of outcomes, even when the expected values are identical.  On the other hand, when a loss is certain, two choices with the same expected value will draw different responses depending on whether the outcome is framed as a risk or as a loss.
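
To make the point concrete, here is a minimal sketch of the arithmetic, using hypothetical dollar amounts rather than figures from the text: a sure $50 against a coin flip for $100 or nothing.

```python
# Hypothetical illustration: a certain $50 versus a 50/50 coin flip for $100 or nothing.
# The two choices have identical expected values, yet most people take the sure thing.

certain_gain = 50.0

# The gamble: 50% chance of $100, 50% chance of $0.
gamble_outcomes = [(0.5, 100.0), (0.5, 0.0)]
gamble_expected_value = sum(p * payoff for p, payoff in gamble_outcomes)

print(f"Certain gain:             ${certain_gain:.2f}")
print(f"Expected value of gamble: ${gamble_expected_value:.2f}")
# Both come to $50.00 -- identical expected values, very different appeal.
```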

Captain Rob is directing five hundred men in retreat.  If he sends them through the valley, there is a 20% chance they will be ambushed by an overwhelming force.  If he sends them across the river, they will almost certainly lose one hundred men to hypothermia.  Which path should the Captain choose?  Although the two options are statistically equal, most will choose the odds rather than the certainty.  The odds can swing both ways, and psychological probability rarely conforms to statistical probability.  As an example, the lottery ticket in your pocket certainly has a much better chance of winning than the odds dictated by probability theory, simply by virtue of the fact that you own it.  We believe that we can manage and domesticate probability; we don't have to accept it.
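
The arithmetic behind "statistically equal" is easy to check.  The sketch below assumes, as the example implies but does not state outright, that an ambush by an overwhelming force costs all five hundred men.

```python
# Captain Rob's choice, compared in expected-loss terms.
# Assumption (implied, not stated in the text): an ambush by an overwhelming
# force means losing all five hundred men.

total_men = 500

# Valley: 20% chance of losing everyone, 80% chance of losing no one.
p_ambush = 0.20
expected_loss_valley = p_ambush * total_men + (1 - p_ambush) * 0

# River: a near-certain loss of one hundred men to hypothermia.
expected_loss_river = 100

print(f"Expected loss through the valley: {expected_loss_valley:.0f} men")
print(f"Expected loss across the river:   {expected_loss_river} men")
# Both come to 100 men: statistically equal, yet most prefer the valley's odds
# to the river's certainty.
```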

The biases produced by our use of heuristics fall into a number of classes; the main categories are decision-making biases, probability and belief biases, social biases, and memory-error biases.  The specific biases I have been able to identify number well over one hundred.  For financial professionals, the ones that get us into trouble are typically decision-making and probability biases.

Following are some biases that I have seen produce real problems:

  • Authority bias is altering your perception of someone or something based upon the subjective opinion of an authority figure, boss, or expert.
  • Availability cascade is a self-reinforcing process in which collective beliefs gain credibility through repetition.  (Which is the greater risk, terrorism or influenza?)
  • Confirmation bias is interpreting and manipulating data in ways that confirm what you already believe.  (Using data the way a drunk uses a lamppost – for support rather than illumination.)
  • Conservative bias is discounting the potential impact of new information and evidence. (Facts and technology change, but the decision maker does not).
  • Disregard of “regression toward the mean” is a bias where one expects exceptional results to continue.
  • Exposure effect is holding an unwarranted opinion of a person or thing merely because of familiarity.
  • Eloquence and Manners bias appears when you conclude that someone knows what they are talking about simply because they are eloquent and well mannered.  This bias also applies to those who are well dressed with a nice haircut.
  • Groupthink (or Bandwagon) bias occurs when you eschew due diligence and choose to act because everybody else is.
  • Halo effect is a bias where you allow one positive trait, such as fame or prior success in another discipline, to spill over into areas requiring a more dispassionate assessment.  (Look at celebrities in politics and the marketing of consumer goods).
  • Illusion of control is a bias where you think you can control the outcome of events when you really have no control whatsoever.  (You increase your wager when the dice are in your hands).
  • Ostrich bias is discounting the importance of obviously negative information. (red cheeks)
  • Professional bias is analyzing events primarily through the lens of your own profession, and not making a broader, more objective analysis.
  • System justification is a bias leading you to defend and bolster the status quo.
  • Zero Risk bias is a focus on reducing all risks, including relatively insignificant ones, as opposed to concentrating on the risks that can actually be managed.

I could provide examples for each of these, but most people, as they read the definitions, are already recalling examples from their own experience.

Even armed with this knowledge, we are not immune.  It's known as "blind spot" bias, the tendency not to acknowledge our own biases.
