Which type of people make the best predictions?

To understand another key to making better forecasts, let’s take a look at what kind of people make the best predictions.

Beginning in 1987, a psychologist and political scientist by the name of Philip Tetlock began recording predictions on topics like politics and the economy made by a wide variety of experts.

After analyzing the accuracy of these predictions as well as the personalities and thinking styles of the experts who made them, Tetlock began to see a clear pattern.

It turned out that the more successful predictors tended to integrate many little pieces of knowledge, whereas the less successful ones clung to one big idea or fact.

He named these two types of people foxes and hedgehogs, respectively.

Hedgehogs are typically brash and confident, claiming that they have discovered big, governing principles that the world adheres to: think Sigmund Freud and the unconscious.

Foxes, however, tend to be more cautious and meticulous, contemplating matters from various perspectives and carefully weighing the pros and cons. They are also more likely than hedgehogs to rely on empirical evidence and data, willingly discarding their own ideologies and preconceptions and letting the data speak for itself.

Of course, hedgehogs’ confidence has far more media appeal, so their predictions tend to garner much more attention than the foxes’. But, at the end of the day, it was the foxes who made the better predictions. In fact, the hedgehogs’ predictions were, overall, only a little better than random guesses.

So it seems that good predictors become good by taking many different factors into account and weighing them from as many perspectives as possible, not by relying on simple, big truths.

One notoriously difficult phenomenon to predict is the short-term behavior of the stock market. Sure, in the long run, the average value of stocks tends to increase, but this information is of little use to most traders because they want to “beat the market.”

That desire has proven spectacularly difficult to fulfill.

For one thing, it’s hard for any individual to predict the behavior of the market well. In fact, one study showed that when seventy economists made stock market forecasts over a multi-year period, their aggregate prediction was always better than any individual economist’s.
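To see why aggregation helps, here is a minimal simulation sketch, not taken from the study itself: the true return, the number of forecasters, and the noise level are all made-up values. It only illustrates the statistical intuition that independent errors tend to cancel out when forecasts are averaged.

```python
import random

random.seed(42)

TRUE_RETURN = 0.05      # hypothetical "true" annual market return
N_ECONOMISTS = 70
N_YEARS = 10

individual_error_sum = 0.0
aggregate_error_sum = 0.0

for _ in range(N_YEARS):
    # Each economist's forecast = truth + independent personal noise.
    forecasts = [TRUE_RETURN + random.gauss(0, 0.03) for _ in range(N_ECONOMISTS)]
    aggregate = sum(forecasts) / len(forecasts)

    individual_error_sum += sum(abs(f - TRUE_RETURN) for f in forecasts) / len(forecasts)
    aggregate_error_sum += abs(aggregate - TRUE_RETURN)

print(f"average individual error: {individual_error_sum / N_YEARS:.4f}")
print(f"average aggregate error:  {aggregate_error_sum / N_YEARS:.4f}")
```

Running this, the averaged forecast beats the typical individual by a wide margin, because the individual noise terms mostly cancel.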

This difficulty was also seen in a study of mutual and hedge funds: a fund that did well in a particular year was no more likely than its competitors to beat the market the following year. Clearly, the success was just a fluke, and no fund was really better at predicting the market than the others.

Why is beating the market so difficult?

Because the stock market is usually very efficient, i.e., there are no easy, sure-fire wins to be had. Most trades are made by very smart, capable people on behalf of large financial institutions which have huge amounts of data and expertise at their disposal. This means that if a stock is over- or underpriced, the market will correct that very quickly.

The only sure-fire way to beat the market is to know something no one else does, and the only source of such an advantage tends to be illegal insider information. Interestingly, one group of investors that seems particularly adept at beating the market is members of Congress, whose investments tend to exceed the returns of the overall market by 5 to 10 percent annually. This is hardly a coincidence: they are privy to insider information through lobbyists and can also influence companies’ business prospects through legislation.

Though the stock market usually tends to be efficient, this breaks down when bubbles form, that is, when stocks become broadly overvalued.

Though there is no foolproof way to predict a bubble, there are some telltale signs.

First, there’s the obvious clue: a sharp increase in stock prices in general. Historically, when the S&P 500 stock market index rose at double its long-term average rate over a five-year period, five out of eight such episodes ended in a severe crash.

Second, you can keep an eye on the price/earnings, or P/E, ratio of stocks: the market price per share divided by the company’s annual earnings per share.

In the long run, the P/E ratio of the entire market tends to be around 15. This means that if the average P/E ratio in the market is much over that, say 30 – as it had been at the height of the dot-com bubble in 2000 – you have a pretty good indication that a bubble is forming.
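As a rough sketch, here is what that heuristic looks like in code. The 15x long-run average comes from the text above; the “roughly double the average” threshold and the sample price and earnings figures are assumptions for illustration.

```python
# Toy bubble check based on the P/E heuristic described above.
# The 15x long-run average is from the text; the sample data is made up.

LONG_RUN_PE = 15.0

def pe_ratio(price_per_share: float, earnings_per_share: float) -> float:
    """Price/earnings ratio: market price per share / annual earnings per share."""
    return price_per_share / earnings_per_share

def looks_like_bubble(market_pe: float, threshold: float = 2 * LONG_RUN_PE) -> bool:
    """Flag a possible bubble when the market-wide P/E is about double its long-run average."""
    return market_pe >= threshold

# Hypothetical market snapshot: price 90, earnings 3 per share -> P/E of 30,
# comparable to the dot-com peak mentioned above.
pe = pe_ratio(90.0, 3.0)
print(f"P/E = {pe:.1f}, bubble warning: {looks_like_bubble(pe)}")
```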

But why do such bubbles form? Shouldn’t investors spot them and sell, thus bringing the prices back down?

Well, actually, if you think about it, they shouldn’t.

You see, most institutional investors invest on behalf of their firm and their clients. When they perform well, they get huge bonuses, and when they perform badly, they might be fired.

Thus, even when they see a bubble forming, they keep buying and reaping bonuses as the market soars. When the crash eventually happens, they’ll only have lost their company’s and their clients’ money, not their own.

What’s more, all their colleagues do the same, so it’s unlikely they’ll be singled out and fired. In fact, after the last three big crashes on Wall Street, only about 20 percent of staff lost their jobs, so there’s an 80 percent chance traders will keep their jobs even if they ignore the bubble.

Like the economy, the climate is a highly complicated, interrelated system, which makes it very difficult to model and make predictions about.

Even very sophisticated models that take into account countless factors, like El Niño cycles and sunspots, have failed spectacularly. For example, the Intergovernmental Panel on Climate Change (IPCC) based its 1990 prediction on such a complicated model, stating that, over the next hundred years, global temperatures would increase by between two and five degrees, with three degrees being the most likely result. But observations made over the next eleven years indicated this was completely wrong: the actual pace was only 1.5 degrees per century, below even the lowest end of the IPCC’s estimate.

Climate scientists themselves are well aware of how difficult modeling is: while almost all of them agree that climate change is occurring due to human activity, they are far more skeptical of the accuracy of their models and the likely effects climate change will have. For example, only 19 percent felt that their models of sea levels rising due to climate change were any good.

So it seems that climate models using lots of data are not accurate. But could we find a simpler model, one that pays attention only to the signal, and not the noise of countless variables?

It turns out that the level of CO2 in the atmosphere is that signal. Simple models from the 1980s that rely only on current and projected levels of CO2 do a far better job of predicting global temperature development than later, more complicated ones.

What’s more, this relationship is not a mere statistical fluke, because there is a plausible cause-and-effect mechanism behind it. The greenhouse effect is a well-established physical phenomenon: greenhouse gases like CO2 accumulate in the atmosphere and trap heat in it.
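To make this concrete, here is a rough sketch of what such a simple CO2-only model might look like. The logarithmic forcing formula and the sensitivity value are standard textbook approximations, not figures from the book, and real models (even simple ones) include more detail.

```python
import math

# Sketch of a simple CO2-only temperature model.
# Standard textbook approximations (not from the book):
#   radiative forcing from CO2: dF = 5.35 * ln(C / C0)   [W/m^2]
#   climate sensitivity: ~0.8 K of warming per W/m^2 of forcing
PREINDUSTRIAL_CO2 = 280.0   # ppm
SENSITIVITY = 0.8           # K per W/m^2

def warming_from_co2(co2_ppm: float) -> float:
    """Equilibrium warming relative to pre-industrial CO2 levels."""
    forcing = 5.35 * math.log(co2_ppm / PREINDUSTRIAL_CO2)
    return SENSITIVITY * forcing

for level in (315, 354, 420, 560):  # ppm: ~1958, ~1990, ~2022, doubling
    print(f"{level} ppm -> ~{warming_from_co2(level):.1f} K above pre-industrial")
```

Note that this toy model yields roughly three degrees of warming for a doubling of CO2, consistent with the “most likely” figure quoted above.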

Unfortunately, accurate predictions are only one part of the solution: nations need to take collective action to change the trend.

We’ve all heard of conspiracy theories relating to the 9/11 World Trade Center attacks. Some claim the attacks were so obviously predictable that the US government must have known about them in advance.

For example, in July 2001, there was a warning about heightened Al-Qaeda activity, and in August 2001, an Islamic fundamentalist was arrested due to his suspicious request to be allowed to practice flying on a Boeing 747 simulator. Also, previous terrorist plots of flying commercial jets into buildings had already been discovered.

But actually, the meaning of these signals is only obvious in retrospect. At the time, all this was mere noise: security agencies charged with preventing terrorism have to sift through hundreds of thousands of such potential leads, the vast majority of which lead nowhere.

Nevertheless, the US government should not have been as surprised by this large-scale attack as it was.

Why?

Because data indicates that such attacks are actually to be expected: the frequency and severity of terrorist attacks follow a pattern known as Clauset’s curve. Basically, when attacks are grouped by how many fatalities they caused and their frequency is plotted on a double-logarithmic scale, the points fall on a strikingly straight line, a power law: attacks become less frequent the more devastating they are.

Clauset’s curve clearly indicates that an attack on the 9/11 scale happens roughly once every eighty years, so the government should have been open to this possibility.
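Here is a sketch of that power-law reasoning. The exponent, cutoff, and attack rate below are made-up values, chosen so the output roughly reproduces the eighty-year figure; Clauset’s actual fits are estimated from real incident data.

```python
# Sketch of the power-law ("Clauset's curve") reasoning with illustrative numbers.
# On a log-log plot, the chance of an attack with at least x deaths follows:
#   P(X >= x) = (x / X_MIN) ** (1 - ALPHA)
ALPHA = 2.3            # hypothetical power-law exponent
X_MIN = 10             # smallest attack size (deaths) the power law covers
ATTACKS_PER_YEAR = 20  # hypothetical rate of attacks with >= X_MIN deaths

def return_period_years(deaths: float) -> float:
    """Expected years between attacks at least this deadly, under the assumed fit."""
    tail_prob = (deaths / X_MIN) ** (1 - ALPHA)
    return 1 / (ATTACKS_PER_YEAR * tail_prob)

print(f"~{return_period_years(3000):.0f} years between 9/11-scale (3,000-death) attacks")
```

With these illustrative parameters, the model predicts an attack of that scale roughly every eighty years; the point is not the exact numbers but that heavy-tailed data makes rare, enormous events an expected part of the pattern rather than an unthinkable outlier.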

The good news is that it seems Clauset’s curve is not set in stone.

Israel seems to have found a successful way of negating the upper end of Clauset’s curve by focusing almost all of its efforts on preventing large-scale attacks, while treating small-scale attacks as something almost akin to ordinary crime. The result has been that, since 1979, no single attack there has claimed more than 200 lives.

Clearly, there is something to this approach that other nations could learn from.

Experts in many areas tend to make astonishingly poor predictions, yet voice far too much confidence in their accuracy. They all trawl through data looking for correlations, but in a world with rapidly increasing amounts of data, this is bound to turn up coincidental patterns that will eventually backfire.

Check out my related post: What is the future of Artificial Intelligence?


Interesting reads:

https://www.goodreads.com/book/show/13588394-the-signal-and-the-noise
