WunderBlog Archive » Dr. Ricky Rood's Climate Change Blog


Lessons from the Data

By: Dr. Ricky Rood, 3:45 AM GMT on August 17, 2007

Hi, I am back for a blog or two. Some interesting things are going on; in the world of blogs, there has been a lot of discussion about the error in the GISS temperature data set.

In a couple of my early blogs I talked about the scientific process and how part of that process is challenging conclusions. Also important is the idea of validation, especially validation by independent investigators. Here are links to my old blogs: A Belief in Science and Incoherent Surface Temperatures.

Steve McIntyre of ClimateAudit.org played a role in finding this error. He has written a blog post on finding the error and its significance, which appeared on the Watts Up With That web site.

To me, this is first and foremost an example of the scientific process working. Given observations, examination, and validation, the result of the investigation was incorporated into the accepted knowledge. And the result changed the data record in a detectable way. Does it change the basic conclusions about a warming Earth? No. There is an amplified response to the change because so many people are attuned to climate change.

The climate data records are continuously being processed and reprocessed. This arises because so many of our observing systems were not originally designed with the stability and calibration needed to determine climate trends. Therefore, we have had to build these climate records and take into account thermometer locations, equipment manufacturers, etc., etc. This has also been true with satellite observations, which suffer tremendous calibration challenges with launches and solar wind and all of that.
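One common way to account for an equipment change is to compare a station against a nearby neighbor: the shared regional climate signal cancels in the difference series, leaving the artificial step visible. Here is a minimal sketch of that idea with made-up numbers; the station names, the +0.3 C bias, and the change date are all illustrative assumptions, not real records.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly temperatures from two nearby stations that share
# the same regional climate signal. Station A switched thermometers at
# month 120, introducing an (assumed) +0.3 C bias; station B did not.
n, switch = 240, 120
climate = rng.normal(0.0, 1.0, n)              # shared regional signal
station_a = climate + rng.normal(0.0, 0.1, n)  # local measurement noise
station_b = climate + rng.normal(0.0, 0.1, n)
station_a[switch:] += 0.3                      # equipment-change bias

# The difference series cancels the shared signal, so the equipment
# change shows up as a clean step that can be estimated and removed.
diff = station_a - station_b
jump = diff[switch:].mean() - diff[:switch].mean()
homogenized_a = station_a.copy()
homogenized_a[switch:] -= jump

print(f"estimated step: {jump:.2f} C")
```

Real homogenization schemes are far more careful (they must detect undocumented change points, too), but the neighbor-difference trick is the heart of it.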

People have talked about how to correct bad data as if it were, almost by definition, an impossible thing to do. It is possible to correct "bad" data if, for instance, you know why the data are bad. There could be an alignment problem which, if quantified, might be accounted for almost exactly. It is possible to correct "bad" data if you know that an instrument is biased. A lot of us know (or hope) that our scales weigh five pounds over. We determine this by using scales of known calibration. And this leads to another way to correct "bad" data: if we have other instruments for comparison. Many of you probably do not know that every single observation used in a weather forecast undergoes quality control, by comparison with a very short-term forecast and with all nearby observations (the buddy check), before it is used to initialize the forecast. Sometimes data are "corrected"; for instance, a balloon sounding that has been reported upside down. This is a problem faced with every measurement, in every science, including the tests that determine your cholesterol.
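The buddy check can be sketched in a few lines. This is an illustrative toy, not an operational QC scheme; the function name, the tolerance, and the thresholds are all assumptions made for the example. An observation is flagged only if it disagrees with both the short-term forecast (the "background") and its neighbors.

```python
import numpy as np

def buddy_check(obs, background, neighbors, tol=3.0):
    """Flag an observation as suspect if it departs from both the
    short-term forecast (background) and the median of nearby
    observations by more than tol times the neighbors' spread.
    Illustrative only; real QC systems are much more elaborate.
    """
    neighbors = np.asarray(neighbors, dtype=float)
    spread = neighbors.std() + 1e-6  # avoid divide-by-zero for identical buddies
    departs_background = abs(obs - background) > tol * spread
    departs_buddies = abs(obs - np.median(neighbors)) > tol * spread
    return bool(departs_background and departs_buddies)

# A reading of 25.0 C where the forecast says 10.2 C and nearby stations
# report about 10 C gets flagged; a reading of 10.5 C does not.
buddies = [10.1, 9.8, 10.4, 10.0, 9.9]
print(buddy_check(25.0, 10.2, buddies))  # suspicious outlier
print(buddy_check(10.5, 10.2, buddies))  # consistent observation
```

Requiring disagreement with both the forecast and the buddies keeps a single bad neighbor, or a single bad forecast, from rejecting a good observation.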

This is one reason why multiple data sets and multiple analyses are so important. In this case the data set produced by the Hadley Centre, HadCRUT3, is an example. For you purists, here is perhaps the most consistent data set available: the temperature measurements from Central England.

Figure 1. Central England Temperatures, 1772 to present. Tell me what you see in this figure.

A final point I would like to make: a lot of people say these data are "not available." They are there for all to get, and that is why McIntyre's analysis was possible. That is part of science as well; the data are available for independent analysis.


(With all of these blogs, I must be treading on the edge of the blogosphere. Why haven't I shown up in Wikipedia yet? I am neither the late wrestler nor the violinist.)

The views of the author are his/her own and do not necessarily represent the position of The Weather Company or its parent, IBM.