A Look in the Mirror
John Mauldin's Outside the Box


The ancient Chinese philosopher Lao-tzu once said, "He who knows others is wise; he who knows himself is enlightened." Learning to overcome one's own emotions and biases can be one of the hardest hurdles that an investor faces. It is equally challenging to discern between advice and noise from "experts" in any given field. Today's Outside the Box is written by my good friends Mark and Jonathan Finn, who address some of the psychological difficulties confronted by investors.

For those of you unfamiliar with this father-son tandem, Mark (father) and Jonathan (son) run Vantage Consulting. Mark is a former chairman of the Virginia Retirement System and has had a distinguished academic career. He consults with large pension funds and high-net-worth investors, and he sits on the boards of several large mutual fund families. Jonathan is well known for his research on the topic of past performance and is a Chartered Financial Analyst. In their article "A Look in the Mirror," they examine systematic errors in the investment field and how to deal with them.

We all have something to learn from looking introspectively at our own decision making. I hope you find this piece to be both enlightening and "outside the box."

John Mauldin, Editor


A Look in the Mirror

By Mark T. Finn and Jonathan Finn, CFA




Consultants hold themselves out to be the experts. Most people assume that being an expert enables one to make better judgments than non-experts can. One key element of superior judgment is a superior forecast. Here at Vantage we have reviewed much of the research on experts' forecasting ability and found it very humbling. The focus of this year-end letter is why experts seem to know so much yet forecast so badly. Experts in many fields have been studied - doctors, military commanders, financial analysts, accountants, psychologists, etc. - and they all suffer from some of the same weaknesses and make similar errors.

First, let us acknowledge that experts clearly do some things better than non-experts. For example, experts process information more efficiently. Experts know a lot about their area of expertise. This allows them to search for information based on a limited set of relevant variables or cues. They can simplify complex problems as well as recognize unusual ones. Also, experts are better at measuring variables and discovering new ones, and they can often explain concepts in their area of specialization better.

However, those and other skills don't seem to lead to superior forecasts in many cases. Indeed, many studies have shown that rather simple statistical models of the key variables experts look at are superior to the experts themselves when it comes to forecasting.

There are three major biases, or systematic errors, that seem to impair experts' ability to forecast. First, consider that a key difference between experts and laymen is the extent to which experts have a complete and unbiased perspective on their field - specifically, whether they understand the characteristics of the relevant universe or population: for example, the population of cancer patients for an oncologist, or the population of growth stocks for a growth stock investment manager. This is a critical element in forecasting.

When people make predictions there are two general categories of information available. There is the information about the specific case at hand, and there is the information about the population of similar cases. For example, if you were trying to predict the likelihood of success of a new restaurant, you would consider such things as location, menu, and reputation. But you should also take into account the information for the relevant population as a whole - in this case, that only 5% of all non-chain restaurants survive more than 5 years. Most people, including the experts who should be most familiar with the relevant population statistics, tend to give too much weight to the specific case at hand and not enough to the base rate - the statistics about the relevant population as a whole.
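The pull of the base rate can be made concrete with a back-of-the-envelope Bayes' rule calculation. The 5% survival figure comes from the text; the two conditional probabilities below are made-up numbers for illustration only:

```python
# Hypothetical restaurant example. Base rate from the text: only 5% of
# non-chain restaurants survive more than 5 years. Suppose (assumed numbers)
# a favorable case-specific report - great location, menu, reputation - is
# seen for 60% of eventual survivors but also for 20% of eventual failures.

base_rate = 0.05                  # P(survive)
p_report_given_survive = 0.60     # P(favorable report | survive) - assumed
p_report_given_fail = 0.20        # P(favorable report | fail)    - assumed

# Bayes' rule: P(survive | favorable report)
numerator = p_report_given_survive * base_rate
denominator = numerator + p_report_given_fail * (1 - base_rate)
posterior = numerator / denominator

print(f"P(survive | favorable report) = {posterior:.1%}")  # 13.6%
```

Even with everything about the specific restaurant looking good, the low base rate keeps the survival odds below one in seven - which is exactly the information people tend to underweight.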

It is disturbing to us how limited investment professionals are in their knowledge of the relevant population statistics. For example, a growth stock manager can be quite detailed about the prospects of a given growth stock he follows, but if you ask him for hard statistics on the population of growth stocks, often he doesn't know much. Does he know what percentage of growth stocks continue to grow at above-average rates five or ten years hence? Does he know what percentage of new product introductions in an industry have historically succeeded? Are investors generally overly optimistic about the prospects of growth companies, and if so, by how much? What is the rate of regression to the mean? What affects that rate?

A true expert would know the answers (based on hard data, not anecdotes) to those kinds of questions and would know how much weight to give that information in his projections. We have often seen experts in the investment field rely too much on anecdotal experience rather than systematic analysis. Indeed, it seems that the more experience an expert has, the more anecdotes there are at his fingertips and the more tempting it is to rely on them - not just by way of example, but to form hypotheses and conclusions.

Indeed, we might suggest that the guild system in which many investment professionals are trained tends to reinforce this reliance on anecdotal learning. Over-generalizing from too small a data sample results in unreliable forecasts.



The second error experts make relates to their failure to be self-critical. While we think of experts as being objective - at least in regard to their area of expertise - this is often not true. Experts often have strongly held beliefs. This sometimes leads them either to seek out data that will confirm their existing beliefs or to underweight disconfirming data. This behavior can be seen in studies that examined the decision-making processes of personnel officers, psychologists, and admissions officers. In different studies on how these experts formed judgments about the individuals they were interviewing, it was determined that they formed impressions or hypotheses about the interviewees quite early in the interview. Once they had formed those tentative conclusions, they spent the rest of the interview seeking information that would confirm their first impressions and ignoring information that might refute them.

A simple example of the tendency to seek confirmatory evidence is the so-called "2, 4, 6" problem. In this classic experiment, subjects are asked to figure out the rule being used to classify sequences of three integers as correct or incorrect. Subjects were given an example of a correct sequence: 2, 4, 6. The subjects were then asked to discover the rule for evaluating a number sequence as correct or incorrect by trying various three-number sequences. The subjects would be informed whether the number sequences they proposed were correct or incorrect based on the rule - in other words, whether the stated number sequences were consistent with the undisclosed number-generating rule.

The actual number-generating rule was "any three numbers in ascending order." However, after starting with the sequence "2, 4, 6," most subjects assumed the rule was "ascending order with equal intervals" and then set about confirming that this rule was correct. If the subjects only look for confirming evidence, they will always be told that their sequences are correct, because their more specific rule is a subset of the true rule; subjects can only eliminate their hypothesis by testing disconfirming examples. For example, the sequence "1, 2, 4" would have provided much more information than continuing to offer examples of ascending order with equal intervals. Yet almost everyone searches for the rule by seeking confirming evidence rather than disconfirming evidence.
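The experiment's logic can be sketched in a few lines of code. The hidden rule and the typical hypothesis below are the ones described in the text:

```python
# The "2, 4, 6" experiment in miniature. The hidden rule (from the text):
# any three numbers in ascending order.
def hidden_rule(seq):
    a, b, c = seq
    return a < b < c

# The typical subject's hypothesis: ascending order with equal intervals.
def hypothesis(seq):
    a, b, c = seq
    return a < b < c and (b - a) == (c - b)

# Confirming tests all satisfy the hypothesis - and every one comes back
# "correct", so the subject never learns the hypothesis is too narrow.
confirming = [(2, 4, 6), (10, 20, 30), (1, 2, 3)]
for seq in confirming:
    assert hidden_rule(seq)  # always "correct": no information gained

# A disconfirming test violates the hypothesis but fits the true rule.
# Its "correct" verdict is what falsifies "equal intervals".
probe = (1, 2, 4)
print(hypothesis(probe), hidden_rule(probe))  # False True
```

Because every hypothesis-conforming sequence is also rule-conforming, confirming tests can never fail; only a sequence like (1, 2, 4) carries any power to refute.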

This failure to evaluate disconfirming evidence seems to affect everyone - including experts. Let's go back for a moment to the example of a personnel officer who uses certain criteria to screen for job candidates who are likely to succeed on the job. Let's say 68% of the job candidates who passed his screening criteria and went to work for his company performed well at their jobs. The personnel officer is satisfied that his screens are effective. However, he doesn't know what percentage of the candidates who failed the screen would nonetheless have succeeded at the job had the company hired them. Let us suppose that 75% of the rejected candidates would have succeeded. If this were true, then the screening criteria the personnel officer used to judge applicants were actually a hindrance to selecting successful candidates. But since the personnel officer can never get feedback from the candidates he rejects, he never learns about the efficacy of his screening criteria.
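The arithmetic behind that example, using the 68% and 75% figures from the text, fits in a couple of lines:

```python
# The personnel-officer example with the text's numbers:
# 68% of hired (screen-passing) candidates succeed, but 75% of the
# rejected candidates would have succeeded had they been hired.
p_success_passed = 0.68
p_success_rejected = 0.75   # invisible to the officer: he gets no feedback

# The screen helps only if passing it raises the success rate.
screen_value = p_success_passed - p_success_rejected
print(f"Screen changes success rate by {screen_value:+.0%}")  # -7%: a hindrance
```

The number the screen's value hinges on - the 75% - is precisely the one the officer can never observe, which is why the illusion of effectiveness persists.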

We have seen analogous situations in the investment field. For example, one well-regarded investment manager had a simple set of criteria to determine which stocks would fall into his universe of potential candidates. He prided himself on the detailed individual-company analysis his organization did in selecting stocks for its portfolios from that candidate universe. His results appeared good: he outperformed the S&P 500 over one, five, and ten years. However, when one compared the performance of the entire candidate universe with that of the actual portfolios, the portfolios significantly under-performed the universe. In other words, all the individual-company analysis he performed after screening down to a candidate universe actually detracted from his performance. Had he looked for adequate feedback, he would have asked why his simple criteria created a universe that outperformed the S&P 500. Did the universe have more risk? Did the universe have certain characteristics that were rewarded by the market for a specific period of time? Or did his simple criteria indeed highlight under-priced stocks?

Unless one gets some form of feedback about decisions or judgments, one cannot learn. This is especially true when the hypothesis is partially correct or when great complexity is involved. The greater the number of factors involved, the greater the scope for rationalizing the outcome of faulty predictions.

The third error experts fall prey to is illusory correlation. This relates to the tendency to see a relationship between variables when none exists. Indeed, research has shown that the more data a person had to sift through to arrive at his illusory correlation, the more strongly he believed it. This brings us to an interesting point. It seems to take a fairly sophisticated understanding of statistics and economics to avoid this error. And many experts in the investment field simply don't have a deep enough understanding of inferential statistics. This includes most consultants.
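A small simulation - ours, not from any study cited here - shows how abundant data manufactures illusory correlations. Every series below is pure noise, yet searching across enough of them reliably turns up an impressive-looking in-sample correlation:

```python
import random

# Compare one random "signal" against many random candidate series and
# report the best in-sample correlation found. All series are pure noise,
# so any relationship discovered this way is illusory.
random.seed(1)

def corr(x, y):
    # Pearson correlation coefficient, computed from scratch.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

target = [random.gauss(0, 1) for _ in range(24)]   # e.g. 24 months of "returns"
candidates = [[random.gauss(0, 1) for _ in range(24)]
              for _ in range(200)]                 # 200 candidate "indicators"

best = max(abs(corr(target, c)) for c in candidates)
print(f"Best correlation among 200 random series: {best:.2f}")
```

With 200 candidates and only 24 observations each, the best correlation typically lands well above 0.4 despite there being nothing but randomness in the data - the more series one sifts through, the more convincing the illusion.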

Consider a classic experiment done in 1948 by B.F. Skinner. Paul Slovic describes the experiment as follows:
Skinner found that hungry birds, given food at brief random intervals, developed very idiosyncratic, repetitive actions. The precise form of this behavior varied from bird to bird, and Skinner referred to these actions as superstitions. What happened to these birds can be described in terms of the concept of positive reinforcement. The delivery of food increased the likelihood of whatever form of behavior happened to precede it. Food was then presented again. Because the reinforced behavior was occurring at an increased rate, it was more likely to be reinforced again. The second reinforcement caused a further increase in the rate of this particular behavior which improved its chances of being reinforced again, and so on. After a short while the birds were found to be turning rapidly counterclockwise about the cage, hopping from side to side, making odd head movements, etc. Because such behaviors are reinforced less than 100 percent of the time during learning, they persist even when reinforcement stops altogether. Animals trained in this way have been known to make as many as 10,000 attempts to obtain a reward that was no longer forthcoming.

(Psychological Study of Human Judgment: Implications for Investment Decision Making, Paul Slovic, The Journal of Finance, Vol. 27, No. 4, Sep., 1972)
With only partial tongue in cheek we would point out a strong similarity between Skinner's experiment and investing in the stock market. There is the fertile ground of overwhelming data from which illusory correlations can be drawn. The outcome is expected to be positive (an increase in wealth). That expected positive outcome is associated with positive reinforcement (good performance) that is at least intermittent if not random. There are few statistically valid analyses provided as systematic feedback that would refute an illusory correlation.


It is very tempting to fall prey to illusory correlation. As Tversky & Kahneman put it:
Lifelong experience has taught us that, in general, instances of large classes are recalled better and faster than instances of less frequent classes; that likely occurrences are easier to imagine than unlikely ones; and that the associative connections between events are strengthened when the events frequently co-occur. As a result [people have] at [their] disposal a procedure (the availability heuristic) for estimating the numerosity of a class, the likelihood of an event, or the frequency of co-occurrences, by the ease with which the relevant mental operations of retrieval, construction, or association can be performed. However, as the preceding examples have demonstrated, this valuable estimation procedure results in systematic errors.

(D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases, (p. 12). New York: Cambridge University Press.)
So what are our conclusions? First, be skeptical of any predictions made by a so-called expert. Second, spend the time to find out the assumptions and insights upon which the expert is relying. Third, find out what documentation or evidence is available to justify reliance on those assumptions. Financial markets are particularly difficult to forecast. Information flows in a torrent, and at times one data point contradicts another. Faced with such a noisy environment, investors tend to lean on experts. This is rational but can easily be taken too far. Armed with a proper understanding of forecasting limitations, investors should adjust their reliance accordingly.

We all understand the world through experience and reason. We face the future conditioned by our past, and what we haven't experienced can hurt us. The reality for investors is that we face a risky future. While the best risk-minimization tool is skill, even the most skillful cannot consistently shield investors from financially painful outcomes. Here the authors find it valuable to remember the last prediction of General John Sedgwick, a Union Army Civil War officer, uttered during the Battle of Spotsylvania in 1864: "They couldn't hit an elephant at this dist------".

Mark T. Finn
Chief Executive Officer

Jonathan Finn, CFA
Chief Investment Officer


Your not-an-expert-on-anything analyst,

John F. Mauldin


John Mauldin is president of Millennium Wave Advisors, LLC, a registered investment advisor. All material presented herein is believed to be reliable but we cannot attest to its accuracy. Investment recommendations may change and readers are urged to check with their investment counselors before making any investment decisions.

Opinions expressed in these reports may change without prior notice. John Mauldin and/or the staffs at Millennium Wave Advisors, LLC and InvestorsInsight Publishing, Inc. (InvestorsInsight) may or may not have investments in any funds, programs or companies cited above.


Communications from InvestorsInsight are intended solely for informational purposes. Statements made by various authors, advertisers, sponsors and other contributors do not necessarily reflect the opinions of InvestorsInsight, and should not be construed as an endorsement by InvestorsInsight, either expressed or implied. InvestorsInsight is not responsible for typographic errors or other inaccuracies in the content. We believe the information contained herein to be accurate and reliable. However, errors may occasionally occur. Therefore, all information and materials are provided "AS IS" without any warranty of any kind. Past results are not indicative of future results.

Posted 01-15-2007 4:18 PM by John Mauldin