In case you’re wondering, I believe that global climate change is real and that it’s going to happen. Actually, it’s not much of a belief; it’s more of an acceptance of the facts. Last year, roughly 97% of American scientists said that climate change is happening, and they know a hell of a lot more about the facts than I do.
There is, however, a wide range of disagreement within that agreement. People can argue just how much humans are to blame, just how quickly things are moving along, just how bad things are likely to get and how much we can do to stop it. And even within agreements in those smaller categories, there’s still minor discord.
The reason is that the climate is a seriously complex beast to tackle. There are a lot of scientists out there with a lot of good – but varying – data that they plug into their own logical – but again varying – climate models. The result is that you get different outcomes – different opinions on, for example, what temperatures will be in 50 years.
Noel Cressie, professor of statistics and director of Ohio State’s Program in Spatial Statistics and Environmental Statistics, thinks he’s found a good way around these disagreements.
Cressie has developed a statistical analysis method that takes the results from other climate change models, assigns a degree of certainty to each resulting data point, and combines it all into one result. It finds the commonalities between models and determines how much weight each individual projection should get in a consensus estimate.
So, for example, if 90% of the models agree on the change in the rate of heat transfer between the surface and atmosphere caused by shrinking ice coverage, the equations including those rates are given a higher level of certainty than, say, the likely carbon emissions of the world in 30 years’ time.
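To give a flavor of how that kind of weighting works, here’s a toy sketch in Python. To be clear, this is a generic illustration of precision weighting (more-certain projections count more toward the combined estimate), not Cressie’s actual method, and every number in it is made up:

```python
# Toy example: combine several model projections into one consensus
# estimate, weighting each by the inverse of its uncertainty (variance).
projections = [4.2, 4.8, 5.1]   # hypothetical warming estimates, deg F
variances   = [0.5, 1.0, 2.0]   # hypothetical uncertainty for each model

# A more certain projection (smaller variance) gets a larger weight.
weights = [1.0 / v for v in variances]
total = sum(weights)

# Weighted average of the projections.
consensus = sum(w * p for w, p in zip(weights, projections)) / total

# The combined estimate is also less uncertain than any single model.
uncertainty = 1.0 / total

print(round(consensus, 2))    # -> 4.5
print(round(uncertainty, 2))  # -> 0.29
```

The point of the exercise: the most confident model pulls the consensus toward itself, and the pooled estimate ends up with a smaller variance than any individual model had on its own.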
For the inaugural study using the analysis, Cressie used two regional climate models developed for the North American Regional Climate Change Assessment Program. The resulting data covered North America in grid blocks roughly 30 miles on a side.
In short, the study said that everything was going to get warmer during all seasons, by roughly an average of 4.5 degrees Fahrenheit by 2070.
The Hudson Bay area in the northeast part of the continent is likely to experience the biggest change over the winter months, with temperatures estimated to rise an average of about 10.7 degrees Fahrenheit – possibly because ice reflects less energy away from the Earth’s surface as it melts. Hudson Bay summers, on the other hand, are estimated to warm by only about 2.1 degrees Fahrenheit.
So no need to buy bathing suits in Toronto quite yet.
The Midwest and Great Lakes regions will experience a rise in temperature of about 5 degrees Fahrenheit, regardless of season. The Rocky Mountains region shows greater projected increases in the summer (6.3 degrees Fahrenheit) than in the winter (4.1 degrees Fahrenheit).
Of course, this is all still a bit of guesswork in my opinion. My go-to weather site www.weather.com can’t even tell me whether it is going to rain in an hour. I kid you not, I’ve seen 10% chances of rain two hours away turn into 90% chances an hour later.
In my opinion, there are just too many unknowns and too much uncertainty. Equations that govern the flow of heat and energy through the atmosphere are extremely simplified at best, and make a lot of assumptions that may or may not be accurate. Even the models that most everyone agrees on could well turn out to be inaccurate. So at best, this study is guesswork based on guesswork. True, they are highly educated guesses, but still estimates and assumptions in my mind.
That’s not to say that I think it’s all rubbish. Pretty much every single one of the models still predicts that things aren’t looking all that great for the future. It’s just that trying to predict the exact temperature changes for different seasons in different regions can’t be an accurate process, so take any exact numbers you see with a grain of salt.