One of the basic principles of blogging is to give credit where credit is due. I always try to mention the source of my information, and most blogs do this with an acronym like h/t (hat tip). This week one of my own blog articles was not credited as the source for another blog article on The Hockey Schtick. That blog then alerted WUWT?, which reported the same news quite loudly, and then Steve McIntyre and Bishop Hill picked it up as well.

Now this is all fine; being credited for my blog post is not the most important thing in the world. However, a lot of noise in the blogosphere could have been saved if WUWT? had seen my original post. The Hockey Schtick was writing about a “paper” presented at the EGU, which then became a “peer reviewed paper” on WUWT? (later corrected), which in turn irritated fellow Dutchman Victor Venema, who happens to work on the very topic of my blog post: the homogenisation of temperature data.

A short history. In April I attended the EGU conference for the first time. I had lunch with Demetris Koutsoyiannis, with whom I spoke extensively in 2008 during the research phase of my Dutch book De Staat van het Klimaat. His work on long-term persistence is very interesting, as are his analyses of climate models. I later also interviewed him for the Dutch magazine De Ingenieur. He invited me to attend the session in which his student Eva Steirou gave the now well-known presentation. He sent me a link to the presentation shortly after the EGU, but only this week did I get around to writing something about it.

It’s a pity that the posts on WUWT? and Climate Audit generated so much irritation, part of it caused by the misrepresentation of the status of the work (a conference presentation, not yet a peer-reviewed paper). The topic itself is pretty important, as the recent work of Venema shows.

Venema reacted on his blog. Some excerpts:

I have never seen an abstract that was rejected at EGU; rejection rates are in the order of a few percent and these are typically empty or double abstracts and are due to technical problems during submission. It would have been better if this abstract had been sent to the homogenisation session at EGU. This would have fitted the topic much better and would have allowed for a more objective appraisal of this work. Had I been EGU convener of the homogenization session, I would probably have accepted the abstract, but given it a poster because the errors signal inexperience with the topic, and I would have talked to them at the poster.

Now this reaction is a bit arrogant. I agree, though, that the homogenisation session would have been the better place for the Steirou presentation; instead, Steirou presented in the session that was organised by Koutsoyiannis himself. I will ask Koutsoyiannis to comment on this.

Now what are the major errors that Venema is talking about?

The first statement cited by Anthony Watts is from the slides:

In 67% of the weather stations examined, questionable adjustments were made to raw data that resulted in “increased positive trends, decreased negative trends, or changed negative trends to positive,” whereas “the expected proportions would be 1/2 (50%).”

Venema replies:

This is plainly wrong. You would not expect the proportions to be 1/2: inhomogeneities can have a typical sign, e.g. when an entire network changes from north-wall measurements (typical in the 19th century) to fully closed double-louvre Stevenson screens in the gardens, or from a screen that is open to the north or to the bottom (Wild, Pagoda, Montsouris) to a Stevenson screen, or from a Stevenson screen to an automatic weather station, as currently happens to save labor. The UHI produces a bias in the series, thus if you remove the UHI the homogenization adjustments would have a bias. There was also a move from stations in cities to typically cooler airports, which produces a bias and again means you would not expect the proportions to be 1/2. Etc. See e.g. Böhm et al. (2001), Menne et al. (2010), Brunetti et al. (2006), Begert et al. (2005), or my recent posts on homogenization. Also the change from roof precipitation measurements to near-ground precipitation measurements causes a bias (Auer et al., 2005).

Now these are some interesting arguments. Anthony Watts has frequently suggested that part of the warming could be due to the growth of airports. Now Venema is turning this argument around. The relocation of stations from cities to airports would actually lead to a cooling bias. I would be interested to see some examples.
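To see why adjustments need not split 50/50 in sign, here is a minimal toy Monte Carlo of my own (not from Venema's post or the Steirou presentation): every station gets one relocation break with a preferred cooling sign, and we count how often a perfect correction of that break increases the trend. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_stations, years = 1000, 100
t = np.arange(years)

def ols_slope(y):
    # Ordinary least-squares trend of a yearly series.
    return np.polyfit(t, y, 1)[0]

increased = 0
for _ in range(n_stations):
    climate = rng.normal(0.0, 0.5, years)            # no real trend, just noise
    break_year = rng.integers(30, 70)                 # year of the relocation
    step = rng.normal(-0.3, 0.1)                      # relocation cools the record
    raw = climate + np.where(t >= break_year, step, 0.0)
    adjusted = raw - np.where(t >= break_year, step, 0.0)   # perfect correction
    if ols_slope(adjusted) > ols_slope(raw):
        increased += 1

print(f"Adjustment increased the trend at {increased / n_stations:.0%} of stations")
```

Because nearly every break in this toy is a cooling step, removing it increases the trend at almost every station, even though each adjustment merely undoes a genuine artefact. That is exactly why the expected proportion is not automatically 1/2.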

Now the second problem that Venema has with Steirou's talk concerns this paragraph:

“homogenization practices used until today are mainly statistical, not well justified by experiments, and are rarely supported by metadata. It can be argued that they often lead to false results: natural features of hydroclimatic time series are regarded as errors and are adjusted.”

He defends homogenisation by writing:

The WMO recommendation is to first homogenize climate data using parallel measurements, but also to perform statistical homogenization as one is never sure that all inhomogeneities are recorded in the meta data of the station.

I was involved in the COST Action HOME, which just finished a study with a blind numerical experiment, which justified statistical homogenization and clearly showed that statistical homogenization improves the quality of temperature data (Venema et al., 2012). Many validation studies of homogenization algorithms have been published before (see references in Venema et al., 2012).

In a different approach, the statistical homogenization methods were also validated using breaks known from metadata in Switzerland (Kuglitsch, 2012). The size of the biased inhomogeneities is also in accordance with numerous experiments with parallel measurements; see Böhm et al. (2010) and Brunet et al. (2010) and references therein.

Definitely, it would be good to be able to homogenize data using parallel measurements more often. Unfortunately, it is often simply not possible to perform parallel measurements because the need for the change is not known several years in advance. Thus statistical homogenization will always be needed as well and, as the validation studies show, it produces good results and makes the trends in temperature series more reliable.
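As an aside, the recommendation quoted above, adjusting on the basis of parallel measurements, can be made concrete with a small sketch of my own (purely illustrative, not Venema's code): run the old and the new set-up side by side for a few years and use the mean difference over the overlap as the adjustment for the older part of the record. The numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
months = 5 * 12   # five years of parallel monthly means

# Hypothetical parallel series: the new screen reads about 0.2 degC cooler.
seasonal = 8.0 * np.sin(np.arange(months) * 2 * np.pi / 12)
old_screen = 10.0 + seasonal + rng.normal(0, 0.3, months)
new_screen = old_screen - 0.2 + rng.normal(0, 0.3, months)

adjustment = np.mean(new_screen - old_screen)
print(f"Adjustment for the pre-change record: {adjustment:+.2f} degC")
# The older part of the series is then shifted by this amount so that the
# whole record is expressed on the scale of the new instrument.
```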

Based on my conversations with Koutsoyiannis, I think that he and Venema more or less agree on this. As I said, Koutsoyiannis is very interested in long-term persistence and has found this kind of ‘behaviour’ in most if not all climatic time series. In the presentation they show that SNHT (the homogenisation method they investigate) has a tendency to correct time series with long-term persistence when no correction is needed. That is, the method detects an inhomogeneity that isn’t there and that is simply part of the natural behaviour of a time series with long-term persistence.

Now this is maybe the most interesting observation in the whole presentation, and one that Venema should find very interesting as well. Much more work needs to be done, as Koutsoyiannis already wrote me in an email this week.
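To illustrate the kind of effect Steirou and Koutsoyiannis describe, here is a minimal sketch of my own (not their code, and not how SNHT is used operationally, where it is applied to difference series against neighbouring reference stations): generate perfectly homogeneous series, once as white noise and once with long-term persistence, and see how often the single-break SNHT statistic exceeds a roughly 95% critical value. The critical value of about 9.2 for 100 values is an assumption on my part, taken from published tables.

```python
import numpy as np

def power_law_noise(n, beta, rng):
    # Crude spectral synthesis of 1/f^beta noise; beta = 0 gives white noise,
    # 0 < beta < 1 gives a stationary series with long-term persistence.
    white = np.fft.rfft(rng.standard_normal(n))
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                       # avoid dividing by zero at frequency 0
    series = np.fft.irfft(white / f ** (beta / 2.0), n)
    return (series - series.mean()) / series.std()

def snht_statistic(x):
    # Single-break SNHT statistic: T0 = max_k [ k*z1^2 + (n-k)*z2^2 ],
    # with z1, z2 the means of the standardized series before/after year k.
    z = (x - x.mean()) / x.std()
    n = len(z)
    return max(k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
               for k in range(1, n))

rng = np.random.default_rng(42)
n, trials, crit = 100, 500, 9.2       # crit: assumed ~95% value for n = 100

for beta, label in [(0.0, "white noise"), (0.8, "long-term persistence")]:
    hits = sum(snht_statistic(power_law_noise(n, beta, rng)) > crit
               for _ in range(trials))
    print(f"{label:>22}: spurious break flagged in {hits / trials:.0%} of series")
```

With white noise the false-alarm rate stays near the nominal 5%, while the long-term-persistent series are flagged far more often, even though no inhomogeneity was inserted. This is the behaviour the presentation points at, in a much simplified setting.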

Koutsoyiannis is always in favor of transparency. Steven Mosher complained on one of the blogs that there is no station list available yet. I am convinced that Koutsoyiannis is willing to make this available and would have done so if a peer-reviewed paper had been published.

The input of Venema, who seems to be very active in the homogenisation community, is of course very welcome. Hopefully this can lead to a more constructive exchange than has so far been the case on the various blogs.

 
