A Theory of Behavioral Finance – Assumption 1 – A Combination of Factors
Posted by Jason Apollo Voss on Nov 17, 2020

Last week I published A THEORY OF BEHAVIORAL FINANCE, a high-level description of my theory. In that work I promised to explore each of my assumptions in greater depth. This week, therefore, I deepen the discussion around Assumption 1, which posited:
Human behavior is a complex combination of multiple factors that must be considered in total to glean insight. The primary factors are:
- Biological, with energy and time conservation being the drivers of these factors
- Psychological, with the level of self-awareness being the driver of these factors
- Sociological, with the level of social pressures and the level of self-awareness being the drivers of these factors
- Immediacy of decision making, with time horizon being the driver of these factors
I stated in my previous article that one of the things that makes my Theory unique is the recognition that multiple sciences must be invoked in order to explain something as complex as behavioral bias. Previously proposed models of behavioral bias have tried to describe too much with too little, in my opinion. But that is not the only thing missing from other attempts to explain behavioral bias.
Behavioral Bias Examples
To demonstrate that multiple scientific disciplines are needed to explain behavioral bias, it is helpful to consider situations in which the evidence of the bias defies any one science’s ability to fully describe its effects. Biological factors tend to focus on things like energy conservation, where quick, instinctual thinking avoids the energy drain that deep thought requires. In other words, survival of the quickest and fittest. Sociological reasons likely have a large overlap with biological ones. Namely, the reason I make personal sacrifices in a group setting is that I am more likely to survive if I am part of a pack of people. However, psychological factors, so frequently a major part of behavioral biases, just as frequently lead to too much deliberation and thus to energy inefficiency.
Let us look briefly at the classic behavioral biases one by one to demonstrate that a more holistic view is necessary to fully explain their effects.
Loss Aversion

Here, people feel the pain of a loss approximately twice as strongly as the pleasure of an equivalent gain. Biology serves as a strong explanation for this bias. In the ancient world, when confronted with uncertainty, it was better to run first and think about what happened second, lest that sabre-tooth tiger eat you at the watering hole.
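To make the roughly two-to-one asymmetry concrete, here is a minimal sketch using the commonly cited prospect-theory value function and its usual parameterization (a loss-aversion coefficient of about 2.25 and a curvature exponent of about 0.88). The numbers are illustrative and are not something my Theory derives:

```python
# Illustrative only: the standard prospect-theory value function with commonly
# cited parameters. Losses loom roughly twice as large as equivalent gains.
ALPHA = 0.88    # diminishing sensitivity to the size of a gain or loss
LAMBDA = 2.25   # loss-aversion coefficient: the "pain of loss" multiplier

def perceived_value(x: float) -> float:
    """Subjective value of a gain (x > 0) or a loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A $1,000 gain "feels like" about +437 units; a $1,000 loss about -982 units.
print(perceived_value(1_000), perceived_value(-1_000))
```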
However, I have worked with firms that believe they make most of their money by preserving capital first and by earning excess returns second. In other words, they are risk-averse, long-term compounding investors. I have also seen closeted risk-takers at these firms who, nonetheless, have become practiced at executing the group norms and their accompanying language. “Boss, I am pitching this stock because it has a wide moat, is liquid, and pays a dividend; all of which provides ample downside protection.” Here sociological factors are also at play. Stranger still is that a desire to fit in can actually lead to behavioral changes on the part of staff. In fact, “a good cultural fit” is one of the things that many investment firms hire for. But is the fit real, or do people just know how to fake it until they make themselves a good fit?
Overconfidence

This bias strongly resists a purely biological explanation. What overconfidence seems to be preserving is not exclusively energy but a person’s fragile ego, clearly a psychological factor. Further, if a person’s reputation within a group is that of a risk-taker, then they are likely to prefer making decisions with little deliberation and with the puffery that is a hallmark of overconfidence, not because of energy conservation but because of reputation conservation. Again, there are likely closeted risk-averse investors at event-driven hedge funds, at small cap growth shops, and at venture capital firms. The biological basis for overconfidence might be that, without this bias, we might not venture out to forage for life-preserving food. That is, we need a kind of hope and confidence in our abilities to confront uncertainty and make decisions.
Confirmation Bias

Here people look for evidence that affirms a belief they hold and discount evidence that does not. There is a biological case to be made that treating situations as similar, which is at the heart of confirmation bias, allows for energy conservation because a person does not need to reconsider held beliefs or weigh new evidence. However, confirmation bias also psychologically allows us to say to a chief investment officer that our view of that credit still holds given the evidence we continue to examine. In other words, ego is being preserved again. Confirmation bias also defers having to admit mistakes, which is unpalatable to the ego and unwelcome in the many social situations where we do not want to be seen as a faulty thinker.
Herding

Mass movement in a single group direction clearly has psychological and sociological components. This explains fashion and music trends, among many other things. A biological explanation, though, is that when we move in a herd there is “strength in numbers.” However, a full unraveling of the factors that lead to herding cannot be had by looking at just one science.
Anchoring

Those anchored are stuck on a thought. Manifestations include the first number uttered in a sequence, the dominant idea in a group discussion, the losses suffered in the Great Recession, and so on. Yes, it conserves energy not to consider other points of view – the biological explanation of anchoring – but it is also the case that “going with the flow” in a group discussion is sociologically beneficial. However, psychologically the benefit of anchoring is again most likely about ego protection. The reason I am waiting for that stock to get back up to its cost basis before selling is that I do not want to admit I made a mistake, because my self-image is that I am a capable investor.
Availability

Overemphasizing information that is easily within reach, resource-wise, is the hallmark of availability bias. Because something is easily available, I prefer it. Clearly this is energy efficient from a biological point of view. However, it is also the case that when a reporter asks us to describe our investment process, it is the firm’s investment philosophy that is likely to spring to mind and that we spoon-feed to the media. This is true even if we have tweaked the firm’s investment process a bit to better deliver alpha for the product we manage. Despite this, we quote the firm’s investment process because we know it is the “correct” answer sociologically.
State changes are one of investing’s hardest problems to deal with. Here, our readily available mental models must adjust on the fly if we are to comprehend what has happened. This may be more difficult psychologically than intellectually. For example, if I am a value investor circa 2020, it is hard for me to defend my last decade’s worth of returns, so I invoke readily available stories about inflation being a likely outcome of too much monetary stimulus. Again, the factor involved here is psychological – I am defending my beloved investment philosophy with a readily available theory.
Representativeness

Making decisions based on stereotypes is energy efficient, though frequently these representations are incorrect. However, stereotypes also typically form within group settings. Hedge funds and investment banks only recruit from the “best” schools because they want only the “best” employees. The result is a biased sample driven by sociological factors – we hire from there because that is where we come from, too – and by psychological factors. Namely, we may have failed, but it was because of hard luck, not because all of us were educated by the same professors spouting the same ideas that we also use as the basis for our thinking.
Mental Accounting

The two biases that map most neatly to solely biological factors are loss aversion and mental accounting. Due to the limitations of working memory and its inability to consider too many ideas simultaneously, it is certainly efficient biologically to parse different ideas into different categories and apply different decision rules to them. For example, apportioning your investment portfolio into a group of “room to grow” securities, “do no harm to my returns, likely to go sideways” securities, and “income paying” securities. Here, the money is all fungible, but we treat it differently because we have mentally accounted for the securities differently.
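A minimal sketch, with hypothetical bucket sizes and returns, of why the money is fungible even when the bookkeeping is not: the dollars earned are identical whether we tally them bucket by bucket or treat the portfolio as a single pool.

```python
# Illustrative only: hypothetical buckets and returns. The dollar gain is the same
# whether we account for it by bucket or across the whole portfolio.
buckets = {
    "room to grow":                {"value": 400_000, "annual_return": 0.12},
    "do no harm, likely sideways": {"value": 400_000, "annual_return": 0.01},
    "income paying":               {"value": 200_000, "annual_return": 0.04},
}

gain_by_bucket = sum(b["value"] * b["annual_return"] for b in buckets.values())
total_value = sum(b["value"] for b in buckets.values())

print(f"Gain tallied bucket by bucket: ${gain_by_bucket:,.0f}")
print(f"Blended return on the single pool: {gain_by_bucket / total_value:.2%}")
# $60,000 either way; only the decision rules we attach to each bucket differ.
```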
In summary, to fully explain the manifestations of behavioral bias we need to invoke biology, psychology, and sociology. Without these three sciences we end up leaving key descriptive aspects of bias out of our understanding. Next up, I consider other things missing from the current work on behavioral finance and its biases.
A Proposed System 3
Amos Tversky and Daniel Kahneman’s groundbreaking research was a significant portion of the wind in the sails of behavioral finance that got the boat moving in the early 1970s. More recently, Kahneman’s highly influential book, Thinking, Fast and Slow,[1] has pushed the ship further along its journey. But I believe the ship needs updating.
For those of you not familiar with Kahneman’s work, he says that human decision-making is best summarized by two systems, System 1 and System 2. More specifically, they are:
- System 1 is fast thinking, characterized by instinctual reactions, snap assessments of situations, subconscious thinking, and centered in the brain’s amygdala region.
- System 2 is slow thinking, characterized by intellectual responses, deep analysis, and centered in the brain’s pre-frontal cortex region.
This model, which many mistakenly believe was postulated by Daniel Kahneman,[2] is incomplete. Certain ways of thinking, like intuition, defy this oversimplification of the mind into two systems. Kahneman makes the mistake of using “intuition” as a synonym for System 1 – a subject I have covered at length elsewhere. This is a mistake because, for example, among the Oxford English Dictionary’s definitions of intuition is:
Immediate apprehension by the intellect alone; a particular act of such apprehension.[3]
Note the combination of “the intellect” – System 2 – and “immediate apprehension” – System 1 – in the OED’s definition. I have proposed elsewhere that there is clearly a System 3, a view supported by neuroscientists who explore intuition/insight. For example, a meta-analysis[4] (i.e. a study of studies) by Sprugnoli et al. (2017)[5] found the following neural correlates for intuition:
- A complex network composed of the anterior cingulate cortex, prefrontal and parietal lobes, claustrum, temporo-occipital regions, middle-temporal gyrus, and insula.
- Both hemispheres of the brain are involved.
- In the left-hemisphere, regions active are: precentral gyrus, middle temporal gyrus, precuneus, cingulate gyrus, claustrum, middle occipital gyrus, uvula (inferior vermis – cerebellum) and insula.
- In the right-hemisphere, regions active are: superior frontal gyrus, insula, precuneus and middle temporal gyrus.
First, notice that Sprugnoli et al. found intuition to be a network of interconnected brain activity, centered in neither the amygdala nor the prefrontal cortex. Second, both hemispheres of the brain are involved, meaning that intuition is a whole-brained activity. Also, those who research intuition consistently note that insights spring into consciousness unannounced rather than at the end of a slow, deliberative process.
So, clearly there is a well-known and universal mental experience – intuition – that is not well described by System 1 and System 2. It is for this reason that I propose there is at least one other decision-making system: System 3.
Next up, a good model of the brain from a neuroscience point of view helps to explain the convergence of biology, psychology, and sociology.
A Model of the Brain
My discussion of Assumption 1 at its core relies on a neuroscience-based model of decision-making proposed by Vartanian and Mandel (2011).[6] Here is an overview of the model:
- To test whether the brain has serial and parallel processes working simultaneously in cognitive tasks, test takers were given two tasks to complete. While they were engaged in the tasks, sensory interruptions were introduced. Seeing how the brain responded to the interruptions allowed Vartanian and Mandel to verify their hypothesis that the brain has both serial (i.e. linear) and parallel modes.
- Using fMRI and EEG, they found that the brain uses both serial and parallel processes within a cognitive task.
- Working through a complex cognitive task has three phases:
- A perceptual component (P)
- A central component (C)
- A motor component (M)
- Vartanian and Mandel found that only the central component (C) establishes a bottleneck; that is, a slowing down of task completion. The thought is that both the perceptual (P) and motor (M) components may act in parallel, but that the central component can only work in a serial fashion.
In short, they propose that every decision has an initial component where the problem requiring a decision is perceived, either by the senses or through meta-cognition (i.e. self-awareness). Then the perception moves on to the parts of the brain that consider a course of action: their central component (C). After the decision is made, action is taken: the motor component (M).
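To make the bottleneck concrete, here is a deliberately stylized sketch with made-up stage durations (the two-task setup and the numbers are my illustration, not figures from Vartanian and Mandel): perception and motor output can overlap across tasks, but the central stage handles only one decision at a time, so the second response is delayed.

```python
# Illustrative only: made-up durations for the perceptual (P), central (C), and
# motor (M) stages of two overlapping tasks. P and M can run in parallel across
# tasks; C is serial, so the second task queues behind the first.
tasks = [("task 1", 100, 300, 150),   # (name, P ms, C ms, M ms) -- hypothetical
         ("task 2", 100, 300, 150)]

central_free_at = 0  # when the serial central stage next becomes available
for name, p, c, m in tasks:
    perceived_at = p                           # perception starts immediately for both tasks
    c_start = max(perceived_at, central_free_at)
    c_end = c_start + c                        # the only stage that queues
    central_free_at = c_end
    print(f"{name}: decision at {c_end} ms, response at {c_end + m} ms")

# task 1: decision at 400 ms, response at 550 ms
# task 2: decision at 700 ms, response at 850 ms -- delayed solely by the central bottleneck
```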
Both of the first two stages of the Vartanian and Mandel model, perceptual (P) and central (C), have ramifications for explaining why behavioral bias manifests. Briefly, I believe that behavioral bias manifests largely due to an underdeveloped metacognition/self-awareness in the (P)erceptual stage of decision-making. That is, we fail to recognize or consider the correct thinking mode needed to solve a problem.
Within the (C)entral part of decision-making, the brain first checks in with memory. If a problem is familiar or similar to one seen before, then we tend to default to System 1 thinking. If, on the other hand, it is unfamiliar, then we tend to default to System 2 thinking. The reason that System 2 thinking is so slow is that deliberate thinking must be done serially and working memory bandwidth is constrained. System 3 thinking, on the other hand, is fast but ends up taxing multiple parts of the brain simultaneously, not just the prefrontal cortex. All decisions require evaluation and trigger hormonal/physiological reactions. This feedback mechanism is exactly where the biases arise. A decision that was a success years ago feels good when made again. Improving our thinking requires interrupting this feedback loop, which is time and energy inefficient in the short run.
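The memory check and feedback loop just described can be sketched as a toy routine. The names, the familiarity check, and the reinforcement rule below are all hypothetical simplifications of my description, not anyone’s measured model:

```python
# Illustrative only: a toy version of the memory check and feedback loop described
# above. Familiar problems default to fast System 1; unfamiliar ones to slow,
# serial System 2. Outcomes that "felt good" reinforce the fast default -- the
# loop in which behavioral bias can take root.
memory: dict[str, float] = {}   # problem -> accumulated positive affect from past decisions

def decide(problem: str) -> str:
    familiarity = memory.get(problem, 0.0)
    system = "System 1 (fast default)" if familiarity > 0 else "System 2 (slow, deliberate)"
    # Hormonal/physiological feedback: a success that "feels good" strengthens the
    # shortcut, whether or not the old decision actually fits the new situation.
    memory[problem] = familiarity + 1.0
    return system

print(decide("wide-moat dividend payer"))  # System 2 the first time...
print(decide("wide-moat dividend payer"))  # ...System 1 thereafter, with no fresh deliberation
```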
Understanding the steps above is the subject of future articles in this series, and doing so puts us on track to fully explain the effects we see in behavioral bias. In turn, this allows for predictions to be made based on A Theory of Behavioral Finance, as well as prescriptions for overcoming these biases.
[1] Kahneman, Daniel. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
[2] The original explication comes from: Stanovich, Keith E. and Richard F. West. “Individual differences in reasoning: Implications for the rationality debate?” Behavioral and Brain Sciences Vol. 23 (2000): p. 658
[3] From “OED: Oxford English Dictionary” Oxford University Press (2020) http://www.oed.com/view/Entry/98794?redirectedFrom=intuition#eid Accessed 16 November 2020
[4] A meta-analysis uses statistical techniques to combine multiple scientific studies into a single, over-arching study in order to better separate out signal from noise within data.
[5] Sprugnoli, Giorgio, Simone Rossi, Alexandra Emmerdorfer, Alessandro Rossi, Sook-Lei Liew, Elisa Tatti, Giorgio di Lorenzo, Alvaro Pascual-Leone, and Emiliano Santarnecchi. “Neural correlates of Eureka moment.” Intelligence (2017)
[6] Vartanian, Oshin and David R. Mandel (Eds.). (2011). Neuroscience of Decision Making. New York and Hove: Psychology Press.