McIntyre said, “Defenders of the weather station system argued that NASA had software that could fix that data…And, so I wrote to NASA in May and asked them for the source code for the adjustment software that they used to fix these stations and they refused to provide it.”
But, “the adjustments are not small,” McIntyre said. “The adjustments that they make are fully equal to the total amount of warming in the United States over the past century.”
According to McIntyre, when he began downloading data from NASA’s website to compare the adjusted and the raw data from the weather stations, “this led to a bit of a fight with NASA in May. As I started downloading the data in sequence they cut off my access to the data.”
“They blocked my IP address,” McIntyre said.
When contacted by phone to verify the computer block, NASA spokeswoman Leslie McCarthy said, “This is the first I’ve heard of this.” McCarthy had not yet responded to the full transcript at the time of publication.
“After I was blocked and I explained myself they still didn’t want to let me have access to the data,” McIntyre lamented.
He continued: “They just said go look at the original data. And I said no, I want to see the data you used. I know what the original data looks like. I want to see the data that you used. But one of the nice things about having a blog that gets a million and a half hits a month is that I then was able to publicize this block in real time and they very quickly withdrew their position and allowed me to have access.”
When he got the data, McIntyre then compared the raw and adjusted data sets for all 1200 U.S. weather stations. “Probably 75 percent of the stations had jumps of at least a quarter degree in the year 2000,” he said.
Conservative media personalities like talk radio host Rush Limbaugh and blogger Michelle Malkin blasted the quietly made revision.
James Hansen, director of NASA’s Goddard Institute for Space Studies, responded to the critics on the left-wing blog DailyKos, saying the change in U.S. temperature data is inconsequential to overall global climate data. In a diary he wrote on the site on August 11, he said, “The effect on global temperature was of order one-thousandth of a degree, so the corrected and uncorrected curves [on global data] are indistinguishable.”
Jeff Kueter, president of the George C. Marshall Institute, said NASA’s mistake cast doubt on all global climate data because the United States was considered the best at taking and analyzing temperature measurements. “If the U.S. doesn’t get this right, what might be happening in other places and why did this error persist so long?” he said.
In an August 13 Newsweek cover story, “The Truth About Denial,” Kueter’s organization was labeled as part of the “denial machine” in cahoots with Exxon Mobil and the American Petroleum Institute.
Exxon Mobil spokesman Gantt Walton said Exxon had no comment regarding NASA’s climate change revisions.
Even though the data has been corrected, McIntyre is not satisfied. “They claim that their adjustment methodology was capable of fixing bad data, I mean, that’s the point I want people to take home from this,” he said. “What they’ve done now is inserted a patch into an error that I identified for them, but they haven’t established that the rest of their adjustment methodology is any good.” He recommended that NASA begin archiving the code it uses to make calculations and subject its data to public scrutiny or peer review.
This isn’t the first time McIntyre has caused a stir by questioning global warming data. The Toronto-based McIntyre joined forces with Canadian economist Ross McKitrick to refute data put forth by the United Nations in 2001 that said use of fossil fuels was causing global warming. Included in the report was a graphic showing 20th century temperatures rising ever more sharply over time in the shape of a hockey stick, which later became the name of the graph. McIntyre and McKitrick found an error in the mathematical calculation used to construct the “hockey stick.” Their findings led to a congressional investigation led by then-chairman of the House Energy and Commerce Committee Rep. Joe Barton (R.-Tex.).
Below is Carpenter’s transcript of her interview with Steven McIntyre:
Q. Can you explain to me in layman’s terms how you found this error?
Yeah. Quickly, a fellow in California named Anthony Watts noticed that some of the weather stations used to make historical U.S. statistics were located in places they weren’t supposed to be. One of them was in a parking lot and the trend for the station in a parking lot was way up and a nearby station that was in a proper location in a rural area was relatively flat. So, this led to some controversy and he started a volunteer effort where people started surveying these weather stations and seeing what they looked like.
Now, defenders of the weather station system argued that NASA had software that could fix that data. And, so it really didn’t matter if the station was in a parking lot in Tucson or something like that. NASA software could fix it. So, that type of adjustment is a statistical issue that interests me. And, so I wrote to NASA in May and asked them for the source code for the adjustment software that they used to fix these stations and they refused to provide it. So I got interested in comparing the version of the temperature history of individual stations that NASA had against the original data. I noticed that in some cases there was a very sharp jump in the differences between these two versions. The NASA version took a step in January 2000 relative to the original data. So, I then collected the data for both the NASA versions and the original data for all 1,200 stations in the U.S. historical network.
This led to a bit of a fight with NASA in May because as I started downloading the data in sequence they cut off my access to the data.
Q. Meaning, your computer?
They blocked my IP address.
Q. Why were they so opposed?
Well, first of all, they probably weren’t used to it; they don’t have a very efficient distribution of the data, so I ended up scraping the data off various web pages, and I had written a computer program to do that. So, I was repetitively downloading data. Anyway, even after I was blocked and I explained myself they still didn’t want to let me have access to the data. They just said go look at the original data. And I said no, I want to see the data you used. I know what the original data looks like. I want to see the data that you used. But one of the nice things about having a blog that gets a million and a half hits a month is that I then was able to publicize this block in real time, and they very quickly withdrew their position and allowed me to have access.
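McIntyre doesn’t describe what his scraper actually looked like, but the pattern he mentions — fetching one station page at a time with a small program and parsing the numbers out — is simple to sketch. Everything below is hypothetical: the URL template, the page format, and the function names are invented for illustration and are not NASA’s actual endpoints.

```python
# Hypothetical sketch of sequentially scraping per-station temperature
# pages. The URL template and page layout are invented for illustration;
# they are not NASA's real endpoints or formats.
import re
import time
import urllib.request

def parse_station_page(text):
    """Pull (year, annual_temp) pairs out of a plain-text page whose data
    lines look like '1998  13.42'."""
    rows = {}
    for m in re.finditer(r"^\s*(\d{4})\s+(-?\d+\.\d+)\s*$", text, re.M):
        rows[int(m.group(1))] = float(m.group(2))
    return rows

def scrape_stations(station_ids, url_template, delay=1.0):
    """Fetch each station page in sequence, pausing between requests so
    the traffic stays polite rather than looking like a flood of hits."""
    data = {}
    for sid in station_ids:
        with urllib.request.urlopen(url_template.format(id=sid)) as resp:
            data[sid] = parse_station_page(resp.read().decode())
        time.sleep(delay)  # rate-limit between sequential downloads
    return data

# The parser can be exercised without any network access:
sample = "Station 42\n1998  13.42\n1999  13.10\n2000  13.87\n"
print(parse_station_page(sample))  # {1998: 13.42, 1999: 13.1, 2000: 13.87}
```

Rapid sequential requests without a delay are exactly the traffic pattern that tends to trip server-side blocking, which matches the IP ban McIntyre describes.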
Once they did that, I downloaded all 1,200 stations and calculated the value of this step in the year 2000. In some cases it was a negative step and in some cases it was a positive step, but it became clear that, for some reason, they had changed the version of data that they were using in 2000. Before 2000 they were using an adjusted version of the data and after 2000 they were using an unadjusted version.
After the controversy broke out NASA has said that the reason they did that was because the adjusted version was never available after 2000. That’s actually untrue. The adjusted version is sitting in exactly the same data directory. It just seems to be an error of some kind on their part.
The amounts on individual stations — and this is where we started, trying to explain problems with individual stations — had jumps of up to one degree centigrade. I calculated a distribution of these jumps for all 1,200 stations. Many of the jumps were negative, and only a fraction of the jumps were small. Probably 75 percent of the stations had jumps of at least a quarter degree in the year 2000. But the average, because there were both positive and negative jumps, ended up being somewhat over 0.15 degrees. That doesn’t necessarily seem like much, but when the entire increase in temperature in the United States had been previously reported to be about half a degree, 0.15 degrees is not a small number when you are measuring half-degree numbers.
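The calculation McIntyre describes is easy to illustrate: for each station, measure the offset the version change introduced at 2000, then summarize the distribution of those offsets. This is a minimal sketch with synthetic toy numbers, not NASA’s actual station data or adjustment method:

```python
# Sketch: given raw and adjusted annual series per station, measure the
# "step" at 2000 as the mean (adjusted - raw) offset from 2000 onward
# minus the mean offset before 2000, then summarize the distribution.
# All station series below are synthetic, for illustration only.

def station_step(raw, adjusted, break_year=2000):
    """Jump in the adjusted-minus-raw offset introduced at break_year."""
    before = [adjusted[y] - raw[y] for y in raw if y < break_year]
    after = [adjusted[y] - raw[y] for y in raw if y >= break_year]
    return sum(after) / len(after) - sum(before) / len(before)

def summarize(steps, threshold=0.25):
    """Fraction of stations with a jump of at least `threshold`, plus the
    mean jump (positive and negative jumps partly cancel in the mean)."""
    big = sum(1 for s in steps if abs(s) >= threshold)
    return big / len(steps), sum(steps) / len(steps)

# Two toy stations: one with a +0.5 C step at 2000, one with none.
years = range(1995, 2005)
raw_a = {y: 10.0 for y in years}
adj_a = {y: 10.0 if y < 2000 else 10.5 for y in years}
raw_b = {y: 12.0 for y in years}
adj_b = {y: 12.0 for y in years}

steps = [station_step(raw_a, adj_a), station_step(raw_b, adj_b)]
frac_big, mean_step = summarize(steps)
print(frac_big, mean_step)  # 0.5 0.25
```

The cancellation in `summarize` is the point of McIntyre’s remark: individual jumps of up to a degree can average out to a much smaller net figure like 0.15 degrees, which is still large against a century-scale signal of about half a degree.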
So, I sent them an email notifying them of this error on Saturday, August 4, pointing out that I thought they had changed data sources. On Tuesday, August 7, they sent me a note agreeing that there was an error, and when I looked at their website, they had replaced the data for all 1,200 U.S. historical weather stations and they’d also replaced their U.S. temperature history. They added a mention of me on their webpage describing their methodology, but didn’t provide any notice to readers that they had replaced all this data. So, for example, if you had been doing a study which required that you knew what the temperature was in Reno, there was no notice that the data you had downloaded prior to August 2007 had contained an error, and in some cases a very large error.
When I looked at what their restated U.S. temperature history was, I noticed there was a change in the leading years. So, I wrote a light-hearted post on my blog that said there’s a change in the leader board at the U.S. Open and that even though people thought that the years 1934 and the years 1998 had been in the clubhouse and had a shower, in fact they were still on the course and that 1934 had a late birdie and 1998 had a late bogie and 2006 had a late-triple bogie and when the dust settled 1934 was now the leader of the U.S. Open.
Q. It seems at the heart of this was that NASA was unwilling to give you the methodology.
There are a couple of layers of issues. One issue was that they had an error. After I identified this particular error to them, I asked them for their source code so I could see how the rest of their adjustments actually worked; this error was really kind of an incidental point in checking their adjustment process. One of the things I started from was trying to evaluate whether their adjustment process was capable of adjusting bad data.
What I think you can conclude from this exercise is that their adjustment software was obviously incapable of picking up fictitious jumps even as big as one degree centigrade in the year 2000, and the proof was in the pudding because they hadn’t picked it up. In fact, they not only failed to fix the error, they created it.
So, they claim that their adjustment methodology was capable of fixing bad data, I mean, that’s the point I want people to take home from this. What they’ve done now is inserted a patch into an error that I identified for them but they haven’t established that the rest of their adjustment methodology is any good.
The adjustments are not small. The adjustments that they make are fully equal to the total amount of warming in the United States over the past century. So, you’re dealing with adjustments that are the same size as the effect that you are trying to measure. So, it’s worth spending a minute or two trying to understand exactly what they did. Now, my interest in these things is understanding exactly what they did.
Now, their point of view is, well, Gavin Schmidt of NASA says, “I don’t get this audit meme,” what he calls the audit meme. Well, you know, everyone in the world, if you aren’t an academic and you’re doing business offerings or you work in a company, you get audited. And you can’t say to an auditor: here are the invoices, you do your own financial statements if you don’t like ours. The auditor says: my only interest is how you did yours. So, Gavin Schmidt says, well, if you don’t like our adjustment methodology, why don’t you do your own calculation, try to publish it in peer-reviewed literature, and we can start from there.
My take is, well, I’ve had other experiences with folks like that before, and if you mis-implement their methodology they scream to high heaven. So, I said “No” and they said “You are asking to be spoon-fed” and I said “No, I’m not asking to be spoon-fed.” I’ll deal with raw code; it’s just that the verbal descriptions in academic articles do not meet the kind of engineering quality level that I expect from things or that I am looking for, and that represents one point of dispute between me and them. They don’t seem to accept the idea that this is an important issue and that academics therefore have to stop being precious and arguing that these codes are their private property.
Q. If NASA were to handle this all better, or to your liking, what are some recommendations you’d give them?
One of the main recommendations I’ve consistently made, both to NASA and to journals, is that when people publish articles they should have to archive the data as they used it, along with the exact provenance of that data: if they downloaded it from an internet archive, they should have to post the URL of the place where they got the data and the date they downloaded it, so you can know the exact version they got in case the versions change. And they should archive the code with which they performed the calculations.
This is not by any means an impossible or far-fetched set of protocols. In econometrics right now, if you want to get an article published in the American Economic Review, a leading journal, that’s exactly what you have to do. That policy was instituted by the then-editor who is now chairman of the Federal Reserve System. It’s a policy that is easy to implement and there is a lot more riding right now on climate policy than there is on labor market studies or studies of inflation. So, I think there’s every reason to require NASA and other contributors to climate science to improve their game in terms of how they provide disclosure to other readers and other researchers of their methodology and data.
In some cases there are some real problems. You know, Lonnie Thompson, the ice guy, has published sort of summaries of his data which are mutually inconsistent. I’ve tried to get original sample data to try to reconcile these, and he’s refused; he’s published articles in journals, and the journals have refused to require him to do it, and the National Science Foundation, which has funded it, has refused to require it. So it’s not just NASA; it’s a very widespread problem in climate science right now.