The fact that we keep seeing events that are supposed to be super rare shows how far out of touch the model parameters are.
Discussion
Yeah, it’s all a bit bollocks; I’m sure we’ve had “1 in 100 year” weather events 20 times in the last few years.
But then we’ve also got a shitload of idiots working in the media!
Yeah, the problem is the model assumption. If you’re going to predict an event, you have to know (or assume) the distribution the random variable is drawn from.
When these reports call something a “three sigma” event, they’re signalling that their assumed distribution is a normal distribution: some events were observed in the past, and a mean and standard deviation were computed from them.
But if the distribution were truly normal, a three-sigma deviation would occur only about 0.3% of the time, far less often than these events are actually observed. Real-world data tend to have fatter tails than the normal assumes.
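To illustrate (this sketch is mine, not the commenter’s), here’s a quick numpy comparison of how often “three sigma” events show up under a true normal versus a fat-tailed Student’s t distribution rescaled to the same standard deviation:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Normal distribution: how often do we see a "three sigma" event?
    normal = rng.standard_normal(n)
    print("normal:", np.mean(np.abs(normal) > 3))   # ~0.0027

    # Fat-tailed alternative: Student's t with 3 degrees of freedom,
    # divided by its own sd (sqrt(df/(df-2))) so it also has sd = 1.
    df = 3
    t = rng.standard_t(df, n) / np.sqrt(df / (df - 2))
    print("student t:", np.mean(np.abs(t) > 3))     # ~0.014, about five times as often

Same mean, same standard deviation, yet the fat-tailed distribution produces “three sigma” events several times more often. Quoting sigmas tells you nothing unless the normality assumption actually holds.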
For example, if your statistics tell you that a particular independent random event should happen roughly once every hundred years, then the chance of it happening in any given year is 1/100 = 0.01. The chance of it happening in two specific back-to-back years is 1/100 * 1/100 = 0.0001. That is, two back-to-back instances should only occur about once in 10,000 years.
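Making that arithmetic concrete (the simulation check is my addition, under the same independence assumption):

    import numpy as np

    p = 1 / 100                    # annual probability of a "1 in 100 year" event
    print(p * p)                   # 0.0001: event hits two specific back-to-back years

    # Sanity check by simulation: fraction of consecutive year-pairs where
    # the event occurred in both years, over 10 million simulated years.
    rng = np.random.default_rng(1)
    years = rng.random(10_000_000) < p
    print(np.mean(years[:-1] & years[1:]))   # ~0.0001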
The more unlikely the observed event, the more you have to question whether it was actually your estimation method that was at fault.
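One way to formalise that intuition (my framing, with made-up numbers, not the commenter’s): compare how likely the observation is under the assumed model versus under a model where the estimate is simply wrong.

    # Model A: the event really is 1-in-100 per year, as estimated.
    # Model B: the estimate is off and it's actually 1-in-10 per year.
    p_a, p_b = 0.01, 0.10

    # We just observed the event in two consecutive (independent) years.
    like_a = p_a ** 2              # 0.0001
    like_b = p_b ** 2              # 0.01
    print(like_b / like_a)         # 100: the data favour B by a factor of 100

Seeing a supposedly 1-in-10,000-year pair is far better explained by “the estimate was wrong” than by “we got unlucky”.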