Paul Klemperer, an Oxford University economist, uses a curious chain of logic to argue in the Financial Times that uncertainty about the extent of climate change should lead us to be more, not less, worried about its potential risks. He turns the skeptics' argument, that climate change forecasts are speculative and unreliable, on its head with the following analogy:
"If, like many of my neighbours in Oxford, you believe that new building exacerbates flooding, how would you feel if models that predicted bad news were discredited?"
I have never been to Oxford (and it's not high on my list of must-see places), so I must assume that Oxford faces a flooding threat such that, in the belief of many people living there, constructing new buildings in Oxford increases the risk of flooding.
Klemperer answers this question with the following statement:
"It depends. If the original models were biased, your best guess of the height of future floods is now lower. But if the models merely underestimated the uncertainty, the range of plausible outcomes is now greater, so flood defences would need to be higher for us to feel safe."
Klemperer is therefore assuming there are two possible outcomes:
1. The mean value of the perceived risk is in fact lower than previously assumed.
2. The distribution of perceived outcomes is in fact flatter; that is, the perceived probabilities of both a lesser and a greater flooding threat than before have in fact become higher.
Klemperer concludes his argument by stating that:
"Likewise, if our understanding of climate systems is flawed, our best guess about the dangers we face may be less pessimistic, but extreme outcomes are more likely."
Now, suppose we have a single measurement estimating something, for example the current population of New York City. The mean of the estimate is necessarily equal to that one value and, since it is the only value, there is no variation around it. Suppose we then obtain a second measurement of New York's current population, and that this second measurement is smaller than the first. Then, no matter how much smaller the second measurement is, the standard deviation around the average of the two is large enough that, under the revised view of the world based on the updated information, the distribution of possible current populations for New York City includes values greater than the first measurement. As this simple example shows, Klemperer is in fact arguing that both of the above outcomes hold simultaneously, and therefore, by necessity, any additional information suggesting that the potential risks of global warming are smaller than currently thought actually increases our perceived probability that an extreme outcome could take place.
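The arithmetic behind this example can be checked directly. The sketch below uses Python's standard statistics module with made-up population figures (the numbers are purely illustrative); the point is that for any two measurements, one sample standard deviation above their mean always lies beyond the larger measurement, since the sample standard deviation of two points is their gap divided by the square root of two, which exceeds half the gap.

```python
import statistics

# Hypothetical example: two estimates of New York City's population,
# where the second estimate comes in lower than the first.
first = 8_500_000
second = 8_100_000

mean = statistics.mean([first, second])   # midpoint of the two estimates
sd = statistics.stdev([first, second])    # sample standard deviation = gap / sqrt(2)

# One standard deviation above the mean exceeds the first (larger) measurement,
# so the revised distribution includes values greater than the original estimate.
print(f"mean = {mean:,.0f}, sd = {sd:,.0f}, mean + sd = {mean + sd:,.0f}")
assert mean + sd > first
```

Running this with any pair of unequal measurements gives the same result: the lower second reading shifts the mean down, but the variation it introduces widens the plausible range past the original figure.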
Klemperer is essentially assuming that new information is no more likely to be true than existing information. But if that is so, and if adding information necessarily increases the perceived risk regardless of whether the new information confirms or challenges the existing information, why bother collecting further information at all?
This exposes the fatal flaw in Klemperer's argument. Science is concerned with developing hypotheses and collecting data to confirm or refute them. Assuming, as Klemperer does, that any additional information by definition confirms the presence of increased risk, even when that information challenges the original belief, constitutes a conviction in the veracity of the original information more characteristic of faith than of science.