My recommendation: 9/10
Summary of notes and ideas
Neuroscientists have recently uncovered two particular traits of significance to investors. The first is that we are hard-wired for the short term. We tend to find the chance of short-term gains very attractive. They appear to stimulate the emotional centres of the brain and release dopamine. This makes us feel confident, stimulated, and generally good about ourselves.
Individuals who have suffered damage to the emotional centres of the brain did not become the walking optimizers known as Homo economicus. Rather, in many cases, these individuals are now actually incapable of making decisions: they make endless plans but never get round to implementing any of them. Bechara et al. (1997) devised an experiment to show how the lack of emotion in such individuals can lead them to make suboptimal decisions.
Emotion, therefore, can both help and hinder us. Without emotion we are unable to sense risk, but with emotion we can’t control the fear that risk generates! Welcome again to the human condition!
It is also worth noting that we are very bad at projecting how we will feel under the influence of emotion - a characteristic that psychologists call ‘hot-cold empathy gaps’. That is to say, when we are relaxed and emotion-free, we underestimate how we would act under the influence of emotion.
Unfortunately, a vast array of psychological research (Muraven and Baumeister, 2000; Baumeister, 2003) suggests that our ability to use self-control to force our cognitive process to override our emotional reaction is limited. Each effort at self-control reduces the amount available for subsequent self-control efforts.
Baumeister (2003) concludes the survey by highlighting the key findings of this research:
1. Under emotional distress, people shift toward favoring high-risk, high-payoff options, even if these are objectively poor choices. This appears to be based on a failure to think things through, caused by emotional distress.
2. When self-esteem is threatened, people become upset and lose their capacity to regulate themselves. In particular, people who hold a high opinion of themselves often get quite upset in response to a blow to pride, and the rush to prove something great about themselves overrides their normal rational way of dealing with life.
3. Self-regulation is required for many forms of self-interested behavior. When self-regulation fails, people may become self-defeating in various ways, such as taking immediate pleasures instead of delayed rewards. Self-regulation appears to depend on limited resources that operate like strength or energy, and so people can only regulate themselves to a limited extent.
4. Making choices and decisions depletes this same resource. Once the resource is depleted, such as after making a series of important decisions, the self becomes tired and depleted, and its subsequent decisions may well be costly or foolish.
5. The need to belong is a central feature of human motivation, and when this need is thwarted, such as by interpersonal rejection, the human being somehow ceases to function properly. Irrational and self-defeating acts become more common in the wake of rejection.
Contrarian strategies are the investment equivalent of seeking out social pain. In order to implement such a strategy you will buy the things that everyone else is selling, and sell the stocks that everyone else is buying. This is social pain. Eisenberger and Lieberman’s results suggest that following such a strategy is really like having your arm broken on a regular basis - not fun!
‘To buy when others are despondently selling and sell when others are greedily buying requires the greatest fortitude and pays the greatest reward.’ - Sir John Templeton
Be less certain in your views, especially if they are forecasts.
• You know less than you think you do.
• Try to focus on the facts, not the stories.
• More information isn’t better information.
• Listen to those who disagree with you.
• Examine your mistakes; failures aren’t just bad luck.
• You didn’t know it all along, you just think you did.
• Judge things by how statistically likely they are, not how they appear.
• Big, vivid, easy-to-recall events are less likely than you think they are.
• Don’t confuse good firms with good investments, or good earnings growth with good returns.
• Use reverse-engineered models to avoid anchoring on market prices.
• Don’t take information at face value; think carefully about how it was presented to you.
• Sell your losers and ride your winners.
Overoptimism and overconfidence tend to stem from the illusion of control and the illusion of knowledge.
‘The greatest obstacle to discovery is not ignorance - it is the illusion of knowledge.’ The simple truth is that more information is not necessarily better information; it is what you do with it, rather than how much you have, that matters.
That fact doesn’t stop the vast majority of investors desperately trying to accumulate more information than their rivals. The evidence suggests that, just like bookmakers, professional investors are generally much too confident.
We are capable of generating new brain cells over most of our lifetime. In addition, the brain isn’t fixed into a certain format. The easiest way of thinking about this is to imagine the brain as a cobweb. Some strands of that cobweb are thicker than others. The more the brain uses a certain pathway, the thicker the strand becomes. The thicker the strand, the more the brain will tend to use that path. So if we get into bad mental habits, they can become persistent.
When the professionals were 100% sure they were correct, they were actually right less than 15% of the time! This fits with the mass of evidence that psychologists have uncovered: while experts may know more than non-experts, they are also likely to be even more overconfident than non-experts.
The strength of the information is represented by the high level of the glowing traits talked about; the weight of the information is very low because the author of the letter is a highly biased source. Griffin and Tversky (1992) have shown that, in general, a combination of high strength and low weight will generate overreaction, whereas low strength and high weight tends to create underreaction.
We have a very bad habit of looking for information that agrees with us. This thirst for agreement rather than refutation is known as confirmatory bias. When Karl Popper wrote his philosophy of science, he stated that the only way of testing a view is to form the hypothesis and then spend the rest of the day looking for all the information that disagrees with it.
Our natural tendency is to listen to people who agree with us. It feels good to hear our own opinions reflected back to us. We get those warm, fuzzy feelings of contentment. Sadly, this isn’t the best way of making optimal decisions. Closely related is self-attribution bias: the tendency to attribute good outcomes to skill and bad outcomes to sheer bad luck. This is one of the key limits to learning that investors are likely to encounter, because it prevents us from recognizing mistakes as mistakes, and hence often prevents us from learning from those past errors.
People judge events by how they appear, rather than by how likely they are. This is called representativeness.
Simonsohn et al. (2004) show through a series of experiments that direct experience is frequently much more heavily weighted than general experience, even if the information is equally relevant and objective.
In fact, in one of their experiments, Simonsohn et al. find that personal experience is weighted twice as heavily as vicarious experience! All of this means that investors’ experience will be a major determinant of their perception of reality.
Inattentional blindness is a phenomenon in which people fail to notice stimuli appearing in front of their eyes when they are preoccupied with an attentionally demanding task.
Think about these effects the next time you’re considering a particular company. If you already hold stock in that company, you may actually impute a higher value than is warranted, simply because you already own the shares. You are likely to enter a meeting with company management expecting to be convinced that any concerns you had are misplaced.
1. Be less certain in your views, especially if they are forecasts.
2. You know less than you think you do.
3. Try to focus on the facts, not the stories.
4. More information doesn’t equal better information.
5. Think whether a piece of information is high strength and low weight, or low strength and high weight.
6. Look for information that disagrees with you.
7. Your failures aren’t just bad luck; examine mistakes to improve your performance.
8. You didn’t know it all along, you just think you did.
9. If you can’t debias, then rebias - we know people will anchor on the irrelevant, so let’s replace the unimportant with the relevant. Set up a sensible valuation framework.
10. Judge things by how statistically likely they are, not how they appear.
11. Don’t overweight personal experience.
12. Big, vivid, easy-to-recall events are less likely than you think they are.
13. Don’t take information at face value; think carefully about how it was presented to you.
14. Don’t value something more simply because you own it.
15. Sell your losers and ride your winners.
Whereas people overestimate the influence of emotional situations on the intensity and duration of their feelings (impact bias), they underestimate the influence of emotional situations upon choice (empathy gaps). It is the latter that we are more likely to encounter in financial markets.
So all the evidence points to the fact that people overestimate the impact of emotional situations on their feelings, but underestimate their impact on their choices and preferences. From a psychological perspective, this differing reaction helps to highlight the difference between feelings and choices.
Reducing impact bias:
• Showing people examples of progression or change. Simply showing a picture of a declining line seems to be enough to prime people to remember that emotions decline over time!
• Asking people to remember a wide variety of past emotional events and their feelings. A single event will not suffice; availability bias ensures that if people are asked to remember a single event they will pick an extreme one. So a wide range of events is needed to combat this problem.
• Asking people to think about background events and activities rather than the easily available aspects. In the study mentioned earlier about football fans, when the fans were asked to consider activities they would be doing - i.e. studying, socializing and working - they made far more moderate forecasts of their happiness as a result of their team winning.
• Structuring choice situations so that options are not compared in a side-by-side fashion. Side-by-side comparison makes features salient (as in the housing experiment discussed earlier), so preventing this allows people to take in other features. Alternatively, asking people to look for similarities or shared features rather than concentrating on the differences could also help to mitigate the problem.
Assuming that risk is rewarded in financial markets, the best investors are likely to be those who can keep their emotional reactions to loss in check. This also offers insights into why it is so hard to be a contrarian investor. If you follow such a strategy you will, by definition, be buying the assets that everyone else is selling, and selling the assets that everyone else is buying. Buying assets that have gone down is likely to trigger the emotion of fear, and it is this that we need to overcome. However, to be not only a successful investor but a successful human being as well, we need all our emotions, including fear, if we are to avoid making poor life choices. The challenge to each of us is to develop the ability to turn our emotions on and off (or, more accurately, to know when to react to them and when to override them) depending on the circumstances.
The major reason we don’t learn from our mistakes (or the mistakes of others) is that we simply don’t recognize them as such. We have a gamut of mental devices all set up to protect us from the terrible truth that we regularly make mistakes.
To combat the pervasive problem of self-attribution we really need to keep a written record of the decisions we take and the reasons behind those decisions. We then need to map those into a quadrant diagram, as shown in Table 6.1. That is, was I right for the right reason (in which case I can claim some skill - it could still be luck, but at least I can credibly claim skill), or was I right for some spurious reason?
Only by cross-referencing our decisions and the reasons for those decisions with the outcomes, can we hope to understand when we are lucky and when we have used genuine skill.
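The record-keeping step can be sketched in code. This is only an illustration of the quadrant idea; the field names and quadrant labels are my own, not from the book:

```python
# Minimal decision-journal sketch: log each decision with its rationale,
# then cross-reference outcome against reasoning to separate skill from
# luck, in the spirit of the quadrant diagram (Table 6.1).

def classify(outcome_good: bool, reason_held: bool) -> str:
    """Map a decision into the skill/luck quadrant."""
    if outcome_good and reason_held:
        return "right for the right reason (possible skill)"
    if outcome_good and not reason_held:
        return "right for the wrong reason (luck)"
    if not outcome_good and reason_held:
        return "wrong despite sound reasoning (bad luck)"
    return "wrong for the wrong reason (a mistake to learn from)"

# Hypothetical journal entries, purely illustrative.
journal = [
    {"decision": "bought stock A", "reason_held": True, "outcome_good": True},
    {"decision": "sold stock B", "reason_held": False, "outcome_good": True},
]

for entry in journal:
    print(entry["decision"], "->",
          classify(entry["outcome_good"], entry["reason_held"]))
```

The point of the exercise is the second quadrant: without the written rationale, a good outcome from a bad reason is silently filed under ‘skill’.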
The basic failure in both the pigeon experiments and Ono’s human experiments is that the subjects looked only at the positive co-occurrences, rather than at the percentage of times the strategy paid off relative to all the times it was tried.
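That base-rate check can be made concrete with a toy contingency table. The counts below are invented for illustration and are not from the studies:

```python
# The proper test of a "lucky ritual" is the full 2x2 contingency table:
# compare P(reward | strategy used) with P(reward | strategy not used),
# rather than just tallying the memorable positive co-occurrences.

reward_with, no_reward_with = 30, 70        # trials where the ritual was used
reward_without, no_reward_without = 30, 70  # trials where it was not

p_with = reward_with / (reward_with + no_reward_with)
p_without = reward_without / (reward_without + no_reward_without)

# The superstitious learner notices only the 30 reward-with-ritual cases.
# The full table shows the ritual changes nothing: both rates are 30%.
print(f"with strategy: {p_with:.0%}, without: {p_without:.0%}")
```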
In their analysis of a wide range of illusion of control studies, Presson and Benassi (1996) conclude that the illusion is more likely when many choices are available, you have early success at the task (as above), the task you are undertaking is familiar to you, the amount of information available is high, and you have personal involvement.
We have outlined four major hurdles when it comes to learning from our own mistakes. Firstly, we often fail to recognize our mistakes because we attribute them to bad luck rather than poor decision-making. Secondly, when we are looking back, we often cannot separate what we believed beforehand from what we now know. Thirdly, due to the illusion of control, we often end up assuming that outcomes are the result of our actions. Finally, we are adept at distorting the feedback we do receive, so that it fits our own view of our abilities.
Some of these behavioural problems can be countered by keeping written records of decisions and the ‘logic’ behind those decisions. But this requires discipline and a willingness to re-examine our past decisions. Psychologists have found that it takes far more information about mistakes than it should to get us to change our minds.
When Karl Popper wrote his philosophy of science he argued that the only way of testing a hypothesis was to form the view and then spend the rest of the day looking for the evidence that proves you to be wrong.
There are at least five psychological hurdles that must be overcome if meeting companies is to add value to an investment process. Firstly, consider the point just made above: more information isn’t better information, so why join the futile quest for an informational edge that probably doesn’t exist? Secondly, corporate managers are just like the rest of us. They tend to suffer from cognitive illusions, so their views are likely to be highly biased. Thirdly, we all tend to suffer from confirmatory bias - that is, a habit of looking for information that agrees with us. So rather than asking lots of hard questions that test our base case, we tend to ask nice leading questions that generate the answers we want to hear. Fourthly, we have an innate tendency to obey figures of authority.
One key finding in the literature on overconfidence is that experts are even more overconfident than lay people. Experts do know more than lay people, but sadly this extra knowledge seems to trigger even higher levels of overconfidence.
Unfortunately, all the evidence suggests that people’s impressions of their skills and abilities are at best moderately correlated and frequently uncorrelated with their actual performance.
However, look what happened to the bookmakers’ confidence: it soared as the information set increased. With five pieces of information, accuracy and confidence were quite closely related. However, by the time 40 pieces of information were being used, accuracy was still exactly the same, but confidence had soared to over 30%! So more information isn’t better information; it is what you do with it, rather than how much you can get, that truly matters.
The trained students and the investigators underperformed the naïve students (see Figure 12.11). Once again they seemed to think that everyone was guilty! It is also interesting to note that confidence increased with training and experience but accuracy didn’t!
One particular form of the curse of knowledge is that once we know something, we can’t imagine ever thinking otherwise. This makes it hard for us to realize that what we know may be less than obvious to others who are less informed. Better-informed players are unable to ignore private information even when it would be in their interest to do so - more information isn’t better information!
So our market can be characterized by the following facts:
• Most players use one, two (or three) levels of thinking, or iterate all the way to infinity.
• A significant number of those who reason their way to 0 nevertheless choose a number greater than 0, because they believe that others are boundedly rational. However, virtually all such players still ended up choosing too low a number.
• A significant proportion of players do choose either 0 or 1 - doomed by their own rationality, or the curse of knowledge.
If our beauty contest is a proxy for market behaviour, then most investors seem to practise two steps of strategic thinking. In Keynes’s parlance, most investors seem to be concerned with ‘The third degree where we devote our intelligences to anticipating what average opinion expects average opinion to be’.
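The levels-of-thinking idea can be sketched numerically for the classic ‘guess 2/3 of the average’ version of the contest. The anchor of 50 and the 2/3 factor are the standard textbook parameters, used here purely for illustration:

```python
# Level-k reasoning in the "guess 2/3 of the average" beauty contest:
# a level-0 player guesses the midpoint (50); each deeper level
# best-responds to the level below by multiplying by 2/3.

def level_k_guess(k: int, anchor: float = 50.0, factor: float = 2 / 3) -> float:
    """Guess made by a player who applies k steps of strategic thinking."""
    return anchor * factor ** k

guesses = [level_k_guess(k) for k in range(5)]
# 50.0, 33.3, 22.2, 14.8, 9.9, ... - only infinitely many steps reach
# the Nash equilibrium of 0; real players mostly stop after a step or two.
print(guesses)
```

The fully rational guess of 0 only wins if everyone else is equally rational, which is exactly the ‘curse of knowledge’ point above.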
Of course, when investors are myopic they tend to check their performance frequently. The more frequently they examine their portfolio performance, the more likely they are to encounter a loss. Such myopic behaviour almost becomes self-fulfilling.
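A rough calculation shows why checking frequency matters. Assuming, purely for illustration, normally distributed i.i.d. returns with a 7% annual mean and 15% annual volatility:

```python
# Probability of observing a loss as a function of checking horizon.
# Over a horizon of h years: mean = mu*h, std = sigma*sqrt(h), so the
# shorter the horizon, the closer the loss probability gets to 50%.
from math import erf, sqrt

def p_loss(mu: float, sigma: float, h: float) -> float:
    """P(return < 0) over horizon h years, under the normal assumption."""
    z = -(mu * h) / (sigma * sqrt(h))
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

daily = p_loss(0.07, 0.15, 1 / 252)   # checking every trading day
annual = p_loss(0.07, 0.15, 1.0)      # checking once a year
# Daily checking shows a loss almost half the time (~49%); annual
# checking shows one far less often (~32%).
print(f"daily: {daily:.1%}, annual: {annual:.1%}")
```

The loss-averse investor who checks daily therefore experiences near-constant pain, which is the self-fulfilling mechanism described above.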
Spinoza argued that comprehension and belief are a single step - that is to say, in order for somebody to understand something, belief is a necessary precondition. Effectively, all information and ideas are first accepted as true, and only sometimes evaluated afterwards as to their truth; once this process is completed, a ‘corrected belief’ is constructed if necessary.
Under a Cartesian system, the person is left merely with an understanding of a false idea, but no belief in it. However, if people are better described by a Spinozan approach, then interrupting the process should lead to a belief in the false idea. So giving people ideas or propositions and then interrupting them with another task should help to reveal whether people operate as Cartesian or Spinozan systems when it comes to beliefs.
When the message was weak, people who were highly distracted showed much more agreement with the message than did the people who only suffered mild distraction. When the message was strong and distraction was high, the students showed less agreement than when the message was strong and the distraction was low. Distraction did exactly what it was meant to do: it prevented people from concentrating on the important issue.
When you are really trying to assess the validity of an argument, do your best to avoid distraction. Turn off your screens, put your phone on call forwarding, and try to cut yourself off from all sources of noise. Of course, management and colleagues may well think you have taken leave of your senses as you sit there with your screens off, but try to ignore them too.
The second potential belief control mechanism is called ‘exposure control’. This is a far more draconian approach than ‘unbelieving’.
Interruption had no effect on the correct identification of a true proposition (55% when uninterrupted vs 58% when interrupted). However, interruption did significantly reduce the correct identification of false propositions (55% when uninterrupted vs 35% when interrupted).
Therefore, regularly confronting beliefs with empirical reality is one way of trying to beat the Spinozan system.
Psychologists have spent many years documenting the fact that group decisions are among the worst decisions ever made, effectively endorsing the view that committees are groups of people who keep minutes but waste hours.
However, for a statistical group to be a useful device, three conditions must be met:
1. People must be unaffected by others’ decisions (effectively, their errors must be uncorrelated).
2. The probability of being correct must be independent of the probability of everyone else being correct.
3. The participants must be unaffected by their own vote possibly being decisive.
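The value of conditions 1 and 2 can be illustrated with a Condorcet-style calculation (parameters are illustrative): when each member is independently right with probability better than chance, majority accuracy climbs with group size.

```python
# Exact probability that a majority of n independent voters is correct,
# when each is right with probability p. Independence is doing all the
# work here - exactly the condition real deliberative groups violate.
from math import comb

def majority_correct(n: int, p: float) -> float:
    """P(majority correct) for n (odd) independent voters, each right w.p. p."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k)
               for k in range(k_min, n + 1))

probs = [majority_correct(n, 0.6) for n in (1, 11, 101)]
# Accuracy rises from the individual's 60% toward near-certainty as the
# group grows - but only while the errors stay uncorrelated.
print(probs)
```

Once members influence one another, the independence assumption collapses and, as the next passage notes, groups amplify rather than cancel individual bias.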
However, the reality of group behaviour is very different. As MacCoun (2002) notes, ‘Groups generally can be expected to amplify rather than correct individual bias.’
Additionally, group discussion tends to lead to group members having more confidence in their decisions after the group deliberations (Heath and Gonzalez, 1995). However, sadly, this increased confidence is not matched by increased accuracy.
Deliberative groups also show an alarming inability to uncover information that isn’t common knowledge, and instead end up centred on the knowledge that is easily available to all the group members.
The general finding from a wide variety of such experiments is that unshared information is highly likely to be omitted from the discussion. Instead, members of the group will tend to concentrate on shared information, leading to a hardening of views and creating an anchor for the subsequent discussions.
Wittenbaum et al. (1999) have also shown that those who share information confirming the group’s views are seen as more competent and more credible by their peers - and by themselves! Wittenbaum et al. call this situation ‘mutual enhancement’.