Adjusting Pristine Data

On September 15, 2008, Anthony DePalma of the New York Times wrote an article about the Mohonk Lake USHCN weather station titled “Weather History Offers Insight Into Global Warming.” The article claimed, in part, that the average annual temperature at this station has risen 2.7 degrees in 112 years. What struck me about the article was the rather quaint description of the manner in which temperatures are recorded, which I have excerpted here (emphasis mine):

Mr. Huth opened the weather station, a louvered box about the size of a suitcase, and leaned in. He checked the high and low temperatures of the day on a pair of official Weather Service thermometers and then manually reset them…

If the procedure seems old-fashioned, that is just as it is intended. The temperatures that Mr. Huth recorded that day were the 41,152nd daily readings at this station, each taken exactly the same way. “Sometimes it feels like I’ve done most of them myself,” said Mr. Huth, who is one of only five people to have served as official weather observer at this station since the first reading was taken on Jan. 1, 1896.

That extremely limited number of observers greatly enhances the reliability, and therefore the value, of the data. Other weather stations have operated longer, but few match Mohonk’s consistency and reliability. “The quality of their observations is second to none on a number of counts,” said Raymond G. O’Keefe, a meteorologist at the National Weather Service office in Albany. “They’re very precise, they keep great records and they’ve done it for a very long time.”

Mohonk’s data stands apart from that of most other cooperative weather observers in other respects as well. The station has never been moved, and the resort, along with the area immediately surrounding the box, has hardly changed over time.

Clearly the data collected at this site is of the highest quality. Five observers committed to their work. No station moves. No equipment changes, according to Mr. Huth (in contrast to what the NOAA MMS records show). Attention to detail unparalleled elsewhere. A truly Norman Rockwell image of dedication.

After reading the article, I wondered what happened to Mr. Huth’s data, and to the data collected by the four observers who preceded him. What I learned is that NOAA doesn’t quite trust the data meticulously collected by Mr. Huth and his predecessors. Neither does GISS trust the data NOAA hands it. What follows is a description of what is done with the data.

Let’s begin with the process of getting the data to NOAA:

From Co-op to NOAA

Mr. Huth and other observers like him record their data on a “B91 Form”, which is submitted to NOAA every month. These forms can be downloaded for free from the NOAA website. Current B91 forms show each day’s minimum and maximum temperature as well as the time of observation. Older records often include multiple temperature readings taken throughout the day. The month’s record of daily temperatures is appended to each station’s historical record of daily temperatures, which can be downloaded from NOAA’s FTP site here.

The B91 form for Mohonk Lake is hand-written, and temperatures are recorded in Fahrenheit. Transcribing the data to the electronic daily record introduces an opportunity for error, but I spot-checked a number of B91 forms – converting degrees F to tenths of a degree C – and found no errors. Kudos to the NOAA transcriptionists.
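
For those who want to check a form themselves, the conversion is simple. Here is a minimal sketch in Python (the rounding convention for the half-tenth case is my assumption, not necessarily NOAA's):

    def f_to_tenths_c(temp_f):
        """Convert a Fahrenheit reading to tenths of a degree Celsius,
        the unit used in the electronic daily record."""
        return round((temp_f - 32.0) * 5.0 / 9.0 * 10.0)

    print(f_to_tenths_c(32))   # 0   (0.0 C)
    print(f_to_tenths_c(68))   # 200 (20.0 C)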

Next comes the first phase of NOAA adjustments.

NOAA to USHCN (part I) and GHCN

The pristine data from Mohonk Lake are subjected to a number of quality-control and homogeneity-testing and adjustment procedures. First, the data are checked against a number of quality-control tests, primarily to eliminate gross transcription errors. Next, monthly averages are calculated from the TMIN and TMAX values. This is straightforward when both values exist for every day in a month, but in the case of Mohonk Lake there are a number of months early in the record with several missing TMIN and/or TMAX values. Nevertheless, NOAA seems capable of creating an average temperature for many of those months. The result is referred to as the “Areal data”.
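
The averaging step itself is easy to sketch in Python (the missing-day tolerance here is my assumption; NOAA's actual threshold may differ):

    import numpy as np

    def monthly_mean(tmin, tmax, max_missing=9):
        """Average the daily midpoints (TMIN + TMAX) / 2 over a month,
        tolerating a few missing days (threshold assumed, not NOAA's)."""
        daily = (np.asarray(tmin, float) + np.asarray(tmax, float)) / 2.0
        good = ~np.isnan(daily)
        if (~good).sum() > max_missing:
            return np.nan          # too many gaps to form a monthly value
        return daily[good].mean()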

The Areal data are stored in a file called hcn_doe_mean_data, which can be found here. Even though the daily data files are updated frequently, hcn_doe_mean_data has not been updated in nearly a year. The Areal data also seem to be stored in the GHCN v2.mean file, which can be found here on NOAA’s FTP site. At least, this is the case for Mohonk Lake.

Of course, more NOAA adjustments are needed.

USHCN (parts II and III)

The Areal data are adjusted for time of observation and stored as a separate entry in hcn_doe_mean_data. The TOB adjustment is briefly described here. Following the TOB adjustment, the series is tested for homogeneity. This procedure looks for non-climatic discontinuities (artificial changepoints) in a station’s temperature record caused by changes to the station, such as equipment moves and replacements. The version 2 algorithm compares the station against up to 40 highly-correlated series from nearby stations. The result of this homogenization is then passed on to FILNET, which creates estimates for missing data. The output of FILNET is stored as a separate entry in hcn_doe_mean_data.
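
The pairwise algorithm itself is involved, but the core idea is easy to sketch: difference the target station against a well-correlated neighbor, then look for a step change in the difference series. A toy illustration in Python (this is my simplification, not the actual NCDC code):

    import numpy as np

    def largest_step(diff, guard=12):
        """Find the candidate breakpoint with the largest mean shift in a
        target-minus-neighbor difference series (diff: 1-D numpy array)."""
        best_shift, best_k = 0.0, None
        for k in range(guard, len(diff) - guard):  # keep a year on each side
            shift = abs(diff[:k].mean() - diff[k:].mean())
            if shift > best_shift:
                best_shift, best_k = shift, k
        return best_k, best_shift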

GISS wants to use the data, but the NOAA adjustments are not quite what they are looking for. So what do they do? They estimate the NOAA adjustments and back them out!

USHCN and GHCN to GISS

GISS takes both v2.mean and hcn_doe_mean_data, and lops off any record before 1880. From hcn_doe_mean_data, GISS uses only the FILNET data. Temperatures in degrees F are converted to Celsius and scaled to units of 0.1C.

This is where things get bizarre.

For each of the twelve months in a calendar year, GISS looks at the ten most recent years in common between the two data sets. For each month in those ten most recent years it takes the difference between the FILNET temperature and the v2.mean temperature, and averages those differences to form a monthly offset. Then, GISS goes through the entire FILNET record and subtracts the appropriate monthly offset from each monthly temperature.
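
To make the procedure concrete, here is a sketch in Python of my understanding of the STEP0 combination (the data layout is mine, not GISS's):

    import numpy as np

    def step0_adjust(filnet, v2_mean, n_years=10):
        """Apply the STEP0 offsets described above. Both inputs map
        (year, month) -> temperature in 0.1 C."""
        offsets = {}
        for month in range(1, 13):
            common = sorted(y for (y, m) in filnet
                            if m == month and (y, m) in v2_mean)
            recent = common[-n_years:]       # ten most recent common years
            offsets[month] = np.mean([filnet[(y, month)] - v2_mean[(y, month)]
                                      for y in recent])
        # subtract each month's offset from the entire FILNET record
        return {(y, m): t - offsets[m] for (y, m), t in filnet.items()}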

It appears to me that what GISS is attempting to do is remove the NOAA corrections from the USHCN data. Standing back to see the forest for the trees, GISS appears to be trying to recreate the Areal data, failing to recognize that v2.mean is the Areal data, and that hcn_doe_mean_data also contains the Areal data.

Here is a plot of the difference between the monthly raw data from Mohonk Lake and the data GISS creates in GISTEMP STEP0 (yes, I am well aware that in this case it appears the GISS process slightly cools the record). Units on the left axis are 0.1C.

Even supposedly pristine data cannot escape the adjustment process.

24 Comments

  1. compguy77
    Posted Sep 24, 2008 at 8:38 AM | Permalink

    Any idea where Mr. Depalma got the information that this site showed a 2.7 degree increase in temp?

    Also, a chart of the raw vs. adjusted data over the time period would be nice if you could.

    • Follow the Money
      Posted Sep 24, 2008 at 5:38 PM | Permalink

      Re: compguy77 (#1),

      Any idea where Mr. Depalma got the information that this site showed a 2.7 degree increase in temp?

      These stories are not the product of traditional investigative journalism; they are packaged for media outlets, with the reporter doing little legwork. Often an article will give a clue to its source. The Environmental Defense Fund is mentioned, a major player in carbon offsets and cap and trade. Perhaps an EDF publicist contacted Thomas Friedman at the Times or another “opinion shaper” there; perhaps the major public relations firm managing this show matched the EDF to the reporter. IOW, this reporter did not find this information on the streets of New York City.

      Why did they plant the article? It is a reaction to growing knowledge among the plebes that the temp records are problematic. This sense is heightened via CA, Watts’ site, and so on. The article serves as holy writ to counter such questions and provide an artefact of information in the holy source, the NYT. “But the New York Times said it doesn’t matter.” “But the New York Times said…” It is a planted talking point in a source still considered the “last word” in American journalism.

      The recent BBC program served a similar duty, but with a broader ambit. It was not proactive but reactive, reacting to key bones of contention bubbling through the public, especially the British public, whose awareness of the issues has grown coincident with the increases in their taxes. Its intention was to stifle, ridicule, and reaffirm.

  2. PhilH
    Posted Sep 24, 2008 at 9:05 AM | Permalink

    I trust that you are sending a copy of this to Mr. Depalma, together with compguy’s question.

  3. Lune
    Posted Sep 24, 2008 at 9:07 AM | Permalink

    If I read your last chart correctly, it slightly cools the older record and slightly warms the modern period, accentuating any 20th century warming trend.

    Reply: No, for Mohonk Lake, GISS is warmer for the older years than the raw data, and cooler for the later years.

  4. rlund
    Posted Sep 24, 2008 at 9:25 AM | Permalink

    I am certainly new to this, but why is a TOB “adjustment” needed? Isn’t the minimum temperature just the minimum, and the maximum just the maximum? E.g., if a cold front is coming through and causes a Tmin of 30F at noon, what is the rationale for changing that number? I guess I just don’t get it.

    • Andrew Cochrane
      Posted Sep 24, 2008 at 10:47 AM | Permalink

      Re: rlund (#4),

      Adjustments are necessary to get the ‘right answer’. (Wink).

      • Luis Dias
        Posted Sep 24, 2008 at 11:10 AM | Permalink

        Re: Andrew Cochrane (#10),

        Paranoia is unwarranted in this case. It seems that the various workarounds chained together make a very distinct graph, but without any intent a priori. I could be wrong on this of course, but from all I’ve seen, this seems to be the case.

        The main point remains one of GIGO (Garbage In Garbage Out): how can you model the earth’s temperature when you can’t measure it properly?

        Either way, as in the “The water of Spain rains in the Plains” post, this is but a pixel in the larger context. A broader inquiry into what’s going on at all the thermometers should give us a better clue to the real error we’re exposed to.

    • BarryW
      Posted Sep 24, 2008 at 12:50 PM | Permalink

      Re: rlund (#4),

      Here’s a tutorial on TOB that might help.

      • Geoff Sherrington
        Posted Sep 26, 2008 at 12:12 AM | Permalink

        Re: BarryW (#13),

        TOBS again. The tutorial appears to be wrong because when the thermometers are read, they are RESET to the present temperature. Using the example given, there is no mention of the time of day that they were read and reset. But if the day 1 minimum of -5.6 at 8 am was read and reset at 11 am on day 1 (to pluck a time from the air), then the -5.6 figure was recorded as a correct min and then erased, starting afresh with another value (-0.6). It is -0.6, not -5.6, that carries over toward day 2.

        If you wish to argue that a reset at 8 am on day 1 was done, yes, it would be there on day 2 at 8 am when the actual min was -3.3. But, the very act of resetting at 8 am would ensure that the max from day 1 was correct and more than likely the max on day 2 was also correct if resetting was also at 8 am on day 2 (on the basis that for each day the max happens after 8 am).

        The argument in the example is special pleading, including a case where the resetting was done at a daily extreme. Leaving the data alone could well be preferable to referencing midnight readings via mathematical infilling, because the thermometers were typically NOT reset at midnight. Midnight is an artificial construct, able to introduce errors of its own.
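
        The reset mechanics are easy to simulate. A toy sketch in Python (made-up hourly temperatures, not Mohonk data):

            import numpy as np

            hours = np.arange(72)                   # three days, hourly
            temp = 10 + 8 * np.cos(2 * np.pi * (hours % 24 - 15) / 24)
            temp[28:32] -= 10                       # cold snap, day-2 morning

            def daily_mins(temp, reset_hour):
                """Min-thermometer readings: the minimum since the last
                reset; the thermometer restarts at the current temperature."""
                mins, start = [], 0
                for h in range(len(temp)):
                    if h % 24 == reset_hour and h > start:
                        mins.append(temp[start:h + 1].min())
                        start = h
                return mins

            # With a morning reset the cold snap depresses two consecutive
            # readings; with a midnight reference it is counted once.
            print(daily_mins(temp, 7))
            print(daily_mins(temp, 0))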

  5. W F Lenihan
    Posted Sep 24, 2008 at 9:48 AM | Permalink

    Can someone explain why weather stations don’t have both Celsius and Fahrenheit thermometers to avoid errors from conversion and reconversion of temperatures?

    • Rod Smith
      Posted Sep 24, 2008 at 10:19 AM | Permalink

      Re: W F Lenihan (#5),

      The conversions/corrections are done at “headquarters,” not in the field. I would expect that all such conversions are automatically done by the data entry program(s), but I have certainly been wrong in expectations before.

      NOAA was one of the early adopters of computers. As I remember, their first computerized forecast was on an IBM “Stretch,” a late 60’s mainframe. (It wasn’t even ‘NOAA’ in those days!)

      Better questions might be: How accurate are the thermometers, and have they ever been calibrated? How often are they calibrated? How often is the site inspected and re-certified? And so on…

  6. Joe Black
    Posted Sep 24, 2008 at 10:09 AM | Permalink

    I’d just like to point out again that February is overrepresented in “mean” annual temperatures as all months are weighted equally without regard to the number of readings (days) per month. Butt, hay….
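
    The arithmetic is easy to check. A quick sketch in Python with hypothetical monthly means:

        import calendar
        import numpy as np

        def annual_means(monthly, year):
            """Compare the equal-weight annual mean (each month counts the
            same) with a day-weighted mean; `monthly` holds 12 values."""
            days = [calendar.monthrange(year, m)[1] for m in range(1, 13)]
            return np.mean(monthly), np.average(monthly, weights=days)

        # a cold February pulls the equal-weight mean down slightly more
        # than its 28 days justify
        print(annual_means([2, -5, 6, 10, 15, 20, 23, 22, 17, 11, 6, 1], 2007))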

  8. Joe Black
    Posted Sep 24, 2008 at 10:15 AM | Permalink

    Somebody has applied seasonal adjustments along the way as shown by:

    AZ_Childs Data

  9. Sam Urbinto
    Posted Sep 24, 2008 at 12:35 PM | Permalink

    We measure the temperature, but we don’t really measure the temperature. I think of it in terms of hourly 1C-resolution samples of the air as a proxy for the land as a proxy for the area, resulting in the averaged high and low points of the day. Then, of course, the upper ocean, sampled however often in shipping lanes, as a proxy for the ocean. And so on. Then adjusted, re-averaged over and over, combined, processed, and turned into an anomaly combining various measurement methods over time into something that’s supposed to be something.

    Um, what was I talking about.

    Oh, why do you need to adjust pristine data, again?

  10. Chad
    Posted Sep 24, 2008 at 1:13 PM | Permalink

    Great post. I love all those colorful graphics used to illustrate the steps involved. I especially like the rabbit in the hat.

  11. R DeWitt
    Posted Sep 24, 2008 at 4:46 PM | Permalink

    I think you would avoid some confusion in your graph if you used something other than a minus sign to separate the location “Mohonk Lake” from the equation representing how the difference was computed. It took me a while to understand how Lune and I were misinterpreting the graph.

    Reply: Yeah, I know. Bad habit. I should have used a colon.

  12. Posted Sep 25, 2008 at 4:20 AM | Permalink

    I got the impression that the weather station pictures that littered one episode of the BBC Iain Stewart series were taken from Anthony Watts’ superb records. He said somewhere earlier that he’d been contacted by the BBC. If so, this would surely be another misrepresentation, a lot bigger than Wunsch’s complaints re Swindle.

  13. GTFrank
    Posted Sep 25, 2008 at 10:41 AM | Permalink

    #1, #16

    Any idea where Mr. Depalma got the information that this site showed a 2.7 degree increase in temp?

    Benjamin I Cook’s paper is here.

    Click to access 2007_cook.pdf

  14. MarkB
    Posted Sep 25, 2008 at 2:39 PM | Permalink

    I grew up in the Jamaica Plain section of Boston, USA, in the 1960s. As far back as the 1700s, people skated on Jamaica Pond, a 70-acre, 50-foot-deep kettle pond. In the 1800s, there was an ice-cutting business taking ice from Jamaica Pond and other local ponds and lakes that shipped ice as far as Asia. The ice houses on the pond could hold 30,000 tons of ice at a time.

    As late as the 1930s, my parents skated on the pond in the winter. By the time I was growing up in the ’60s, the pond was closed to skating. In many years since, the pond has failed to freeze over completely. When they were harvesting ice, they liked to get at least 12 inches before they cut. The difference is dramatic, and tells me – without yearly data – that the Boston area is warmer now than it was from the mid-to-late 1700s through the 1930s. I don’t know the size of the difference, but it must be significant to generate – or not – 12 inches of ice on a pond that size.

    That said, the warming observed tells me nothing about the cause of the warming. There has definitely been a warming trend in the Boston area, but I have no idea why. Correlation, as they told me in statistics class, is not causation. No ifs, ands or buts.

  15. MarkB
    Posted Sep 25, 2008 at 2:40 PM | Permalink

    Oops! The link above still works.

  16. HMcCard
    Posted Sep 25, 2008 at 4:25 PM | Permalink

    John Goetz,

    (I tried to post this comment on Anthony’s website but it didn’t go through for some reason; I’ll try submitting it here.)

    I’m sure that you know that the daily temperature data from the Mohonk Lake surface station can be found for the interval 05/1948 through 12/2005 at

    http://cdiac.ornl.gov/cgi-bin/broker?_PROGRAM=prog.climsite.sas&_SERVICE=default&id=305426

    I selected a random sample of ten completed B91 Forms for Mohonk Lake from the site that you referenced

    http://www7.ncdc.noaa.gov/IPS/coop/coop.html?_page=2&state=NY&foreign=false&selectedCoopId=305426&_target3=Next+%3E

    and verified that the values for TMAX(F) and TMIN(F), with one exception, were identical. The exception, which occurred on 07/12/1956, is an obvious transcription error; the value (67°F) listed in the “At OBSN” column was recorded for TMIN(F) instead of the correct value (64°F). In addition to TMAX(F) and TMIN(F), the USHCN site also lists TAVE(F), which equals [TMAX(F) + TMIN(F)]/2 rounded to the next higher integer when TMAX(F) + TMIN(F) is odd. Therefore, I believe the data archived at the site that I referenced are the “RAW” daily temperature data for station 305426.
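
    In code form, the rounding convention looks like this (a sketch in Python; integer °F inputs assumed):

        def tave(tmax_f, tmin_f):
            """TAVE rule described above: the integer average of TMAX and
            TMIN, with the half-degree case rounded up."""
            total = tmax_f + tmin_f
            return total // 2 + total % 2   # ceil(total / 2) for integers

        print(tave(70, 64))   # 67
        print(tave(67, 64))   # 66, i.e. 65.5 rounded up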

    So far, so good …

    I then calculated the monthly averages for TAVE(F), TMAX(F) and TMIN(F), for the interval 05/1948 through 12/1959 and compared the results with the corresponding monthly values that I downloaded from

    http://cdiac.ornl.gov/cgi-bin/broker?_PROGRAM=prog.climsite_monthly.sas&_SERVICE=default&id=305426

    How did they compare? Not so good!!

    The average differences for the 139 month interval are (s.d. shown in parentheses):

    RAW TAVE(F) – HCN TAVE(F) = 1.39°F (0.43°F)
    RAW TMAX(F) – HCN TMAX(F) = 1.43°F (0.62°F)
    RAW TMIN(F) – HCN TMIN(F) = 0.83°F (1.17°F)

    What adjustments do you think were made by NCDC that caused these differences?

    I may add to this post after I complete my examination of the 1948:2005 RAW data set.

    Reply: Anthony’s spam bucket tends to grab stuff with a high link-to-text ratio. I think your post was above whatever that limit is. It looks like one of the moderators did find it and let it get through.

    As for what you are seeing, I suspect it is homogenization, TOBS, and FILNET differences.

  17. gopalan
    Posted Sep 26, 2008 at 12:43 AM | Permalink

    W F Lenihan:
    September 24th, 2008 at 9:48 am

    Can someone explain why weather stations don’t have both Celsius and Fahrenheit thermometers to avoid errors from conversion and reconversion of temperatures?

    A most reasonable doubt. Men will do the rational thing, but only after exploring all other possibilities!

  18. Jim Rogers
    Posted Mar 22, 2009 at 11:19 AM | Permalink

    We all know that the average of max/min is not a good indicator of average daily temperature: it shows a warmer daily mean in the winter and a cooler daily mean in the summer than a mean computed from hourly or 6-hour readings (in winter the temperature stays low for most of the day with a brief max peak, and vice versa). Why not run the analysis using only max and min daily temperatures? For example, this would remove the error introduced by different month lengths. The reason we used to use the means was the lack of computing capability. I use this approach for some environmental assessment analysis with Kugluktuk/Coppermine data (1932-2006). In that case the trend of max temps rose more quickly than that of min temps, but both were still close to 0.3 C/decade. I suggest ignoring the daily data infills, which appeared biased. The switch to automatic remote data collection was visible in the record.
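
    A toy illustration of the winter bias (made-up hourly series with a brief afternoon peak; not real data):

        import numpy as np

        hours = np.arange(24)
        # winter day: cold most of the day, short afternoon warm-up
        temp = -10 + 6 * np.exp(-((hours - 14) ** 2) / 8.0)

        midrange = (temp.max() + temp.min()) / 2   # the max/min "mean"
        print(midrange, temp.mean())               # midrange reads warmer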