First off, the UK. I've used the Central England Temperature (HadCET) dataset, available here. Monthly means are used; running from 1659 to date, this is the longest dataset of monthly temperatures, based on a selection of locations in England (PDF table 1a).

I've calculated the full-year average for 2014 on the assumption that December 2014 is average for the 30-year period 1984 to 2013. On this assumption, 2014 is the warmest year since 1659.

As can be appreciated from the above graph, local weather strongly affects temperature, although the overall warming trend due to anthropogenic global warming is clearly evident. I have calculated the difference between the January to November and January to December average temperatures: the average difference is 0.47degC, with a standard deviation of 0.15degC. For Decembers from 1984 to 2013 the average temperature has been 4.86degC; at present temperatures are probably running about that, and I note from the forecast that milder weather is due to set in over the coming weekend. For a new record to be set by January, the December average merely has to exceed about 4.1degC (at which point 2014 would match the previous warmest year in the HadCET record, 2007).
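The December threshold can be back-calculated from the annual mean identity. A minimal sketch; the example values below are hypothetical placeholders, not the actual CET figures:

```python
# Sketch: the December monthly mean needed for the full-year average to
# reach a target annual mean, given the Jan-Nov mean. Example values are
# hypothetical, not the actual CET figures.
def december_needed(jan_nov_mean, target_annual_mean):
    # Full-year mean = (11 * jan_nov_mean + december) / 12, solved for december.
    return 12 * target_annual_mean - 11 * jan_nov_mean

print(december_needed(11.0, 10.5))  # -> 5.0: a warm Jan-Nov needs only a mild December
```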

In the UK, a December average of 4.1degC is rather cool. Only 7 of the 30 Decembers from 1984 to 2013 were colder; of those, the coldest was December 2009 (-0.7degC), the start of an exceptionally cold winter. Such cold Decembers seem to be associated with a state of weakened westerlies, the difference from the long-term average being an easterly flow; whether December sees such a state develop remains to be seen. Forecasts to mid December do not suggest it is likely.

So it is not a certainty, but from the Central England Temperature series it looks likely that in January 2015 we will be hearing on the news that 2014 was the warmest year in England since 1659.

Using the January to October average isn't as poor a choice as it might seem at first glance. The average difference between the January to October and January to December averages is 0.00degC, showing the difference to be well balanced around zero; the maximum is 0.04degC and the minimum is -0.05degC. Relative to the 2014 January to October average of 0.66degC, that maximum and minimum represent deviations of +6% / -7%.
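Those statistics are straightforward to compute across years. A minimal sketch, assuming `monthly` is a hypothetical dict mapping year to a list of twelve monthly anomalies (not real GISS data):

```python
# Sketch: mean/max/min of the difference between the full-year (Jan-Dec)
# average and the partial-year (Jan-Oct) average, across years.
# `monthly` maps year -> list of 12 monthly anomalies (hypothetical data).
def partial_vs_full_stats(monthly):
    diffs = []
    for months in monthly.values():
        jan_oct = sum(months[:10]) / 10
        full_year = sum(months) / 12
        diffs.append(full_year - jan_oct)
    return (sum(diffs) / len(diffs), max(diffs), min(diffs))

# Toy data: years with flat anomalies give zero differences by construction.
mean_d, max_d, min_d = partial_vs_full_stats({2013: [0.5] * 12, 2014: [0.6] * 12})
```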

The previous maximum anomaly of annual average global temperature in GISS LOTI is 0.66degC, reached in both 2010 and 2005, with 0.62degC in 2007; so far, three years exceed the 1998 high of 0.61degC. 2014 could easily become the fourth year to exceed the 1998 record, a record set with the help of a super El Nino; no such feature aided those three subsequent warmest years.

Assuming that the probability distribution is rectangular, applying the +6% / -7% uncertainty to the 0.66degC supposed for 2014 gives a 2/3 probability of this year not beating 2010, and 1/3 of it beating 2010 and making a new record. For 2014 *not* to be the fourth year to beat 1998 would require the difference between the January to October average and the full-year January to December average to be the lowest since 1880.
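The rectangular-distribution estimate can be sketched in a few lines. All inputs here are the rounded figures quoted in the text; the exact values behind the 1/3 may differ slightly, so this is illustrative arithmetic, not a reproduction of the calculation:

```python
# Sketch: probability of a new record if the full-year anomaly is uniform
# (rectangular) on [estimate*(1+rel_lo), estimate*(1+rel_hi)].
# Inputs are rounded figures from the text; illustrative only.
def p_exceeds(estimate, rel_lo, rel_hi, record):
    lo = estimate * (1 + rel_lo)
    hi = estimate * (1 + rel_hi)
    if record >= hi:
        return 0.0
    if record <= lo:
        return 1.0
    # Fraction of the uniform interval lying above the record value.
    return (hi - record) / (hi - lo)

p = p_exceeds(0.66, -0.07, 0.06, 0.66)  # chance of beating the 2010 record
```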

So the odds look reasonable for both the UK Central England Temperature series and the NASA GISS LOTI series to set new record highs this year. Even without such records, the data will add to the more important upward trends in both UK and global temperature. Trends are not just about records, and it's trends that matter.

***

Before I click to post this, it occurred to me to look at NCEP/NCAR surface temperature for the whole globe, which is now available to November 2014. So, for the sake of idle curiosity, here's the January to November NCEP/NCAR Reanalysis surface temperature for the globe.

Source.

## 8 comments:

I feel a need to point out that of course the average difference between the Jan-Oct anomaly and the Jan-Dec anomaly is zero. That's the definition of an anomaly.

The important thing here isn't the mean change. We know that's zero. The important thing is the conditional mean, or the mean change considering the correlation with previous months. If the October monthly anomaly is high, then the November and December anomalies are likely to be high, meaning that the annual average is likely to go up.

The big difference between this year and previous record years is that most record years started higher than average and then became more typical as the year went on. This year is the reverse. It has gotten hotter as the year has progressed. The probability that this year will be record breaking is considerably higher than your calculations show.

My lay impression of the latest NOAA update, particularly the chart "Sub-Surface Temperature Departures..." that shows the progression of the Kelvin wave, is that ENSO is a stronger contributor to surface temperature anomaly in Nov/Dec than in the rest of 2014 on average.

http://www.cpc.ncep.noaa.gov/products/analysis_monitoring/lanina/enso_evolution-status-fcsts-web.pdf

OMSM,

Not sure I follow you, but perhaps I should have been clearer. The GISS LOTI baseline period is 1951 to 1980. When I describe the 'tidy' behaviour of the difference between Jan-Oct and Jan-Dec anomalies (e.g. 0degC average difference and near balance on the max/min), this is for the entire period 1880 to 2013. So that behaviour doesn't seem to me to be an outcome of the use of anomalies, because it covers a period far longer than the subset used to calculate the anomalies. FWIW, the average of the Jan-Oct and Jan-Dec difference from 1880 to 2013 is 0.000437. Calculating the same average for 1951 to 1980 (the period of the GISS baseline) gives -8.3*10^-5, which can be considered an artefact of rounding error and is, as you caution, (as near as damn it) zero.

Iceman,

Thanks for that paper. I'm not more knowledgeable than you, but from "Weekly Heat Content Evolution in the Equatorial Pacific" it seems the current downwelling Kelvin wave may be near an end, and cooling upwelling could start. But yes, it does look like warming may be ongoing in Nov/Dec.

I'm not sure when in the month GISS updates, but GISS LOTI has not yet updated with November data.

I can try again. I had two basic points. Stating the first one more generally: if you do a bunch of computations on some data and get a result like 0.000437 (or otherwise close to zero), that is usually telling you more about your computations than about your data. I would be extremely reluctant to draw conclusions about the behaviour of new data based on this result, unless I'd worked through the meaning of the maths. Results where the calculations are nearly 0 but clearly not zero, like 0.1, are more reliable for predictions than when the result is almost exactly 0.

My second point, about the correlations in the data, is essentially the same as what iceman was saying. If you look at the change in anomaly over the course of the year, most previous record years had high monthly anomalies early in the year due to ENSO effects, with the anomalies falling as the year went on.

This year, the high monthly anomalies have come at the end of the year, again possibly due to ENSO effects. Because monthly anomalies tend to be persistent (or are correlated), it is likely that the November and December anomalies will be quite large.
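The persistence invoked here is just lag-1 autocorrelation of the monthly series, which is easy to check. A minimal sketch on made-up series (a real monthly anomaly record would be substituted):

```python
# Sketch: lag-1 autocorrelation of a monthly anomaly series -- the
# 'persistence' described above. The series below are made-up examples.
def lag1_autocorr(series):
    n = len(series)
    mean = sum(series) / n
    num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in series)
    return num / den

# A smoothly rising series is persistent; an alternating series is not.
print(lag1_autocorr([0.1, 0.2, 0.3, 0.4, 0.5]))    # positive
print(lag1_autocorr([0.1, -0.1, 0.1, -0.1, 0.1]))  # negative
```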

Therefore, if you consider the recent monthly anomalies, it is more likely that this year will be record breaking than you would conclude just by considering the total year to date anomaly.

OSMM, (got the acronym right this time!)

I read something different into the close to zero, but not zero, behaviour of Jan to Oct and Jan to Dec difference.

The difference during the anomaly period is virtually zero, allowing for expected rounding error summed over 30 years. OK, that's fine, I think we agree what is going on there. But over the longer period 1880 to 2013? Here is what I think is happening.

The factors that set global average temperature smooth out the short-term variance seen in a more localised dataset such as the CET; after all, this is a global average. So whatever factors have established the anomaly in the first ten months of the year are, on average, likely to persist for the final two. This is wholly unlike the rapidly varying behaviour seen on a local scale such as England's. Thus it should not be a surprise that, for anomalies varying from -0.2 to +0.6 (from memory), the 10-to-12-month difference is merely +/-0.04(ish), a ratio of about 1/25.

However it does mean that GISS at least seems to be amenable to prediction once the majority of the year has passed.

I used a rectangular distribution for my estimate of the likelihood of this year being a new record in GISS. This is probably wholly unlike the actual probability distribution, but it will do for a rough back-of-envelope calculation. I can quite believe that the probability is higher than the 1/3 I calculated with that rough method.

I am still awaiting the update for November GISS.

Rather than just talk, I'll do some maths.

OSMM,

You're right. Using NCEP/NCAR global temperature, because it is to hand: taking anomalies has the effect of massively reducing the difference between successive overlapping averages.
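The effect can be demonstrated on made-up data with a strong seasonal cycle: subtracting each calendar month's long-term mean (the climatology) removes the cycle, so overlapping partial-year averages stop diverging. A sketch, with invented numbers standing in for the reanalysis data:

```python
import math

# Sketch: why anomalies shrink differences between overlapping averages.
# Made-up monthly data: a strong seasonal cycle plus a tiny warming trend.
def to_anomalies(monthly_by_year):
    """Subtract each calendar month's long-term mean (the climatology)."""
    years = list(monthly_by_year)
    clim = [sum(monthly_by_year[y][m] for y in years) / len(years)
            for m in range(12)]
    return {y: [monthly_by_year[y][m] - clim[m] for m in range(12)]
            for y in years}

raw = {y: [10 + 8 * math.sin(2 * math.pi * m / 12) + 0.01 * (y - 2000)
           for m in range(12)]
       for y in range(2000, 2010)}
anom = to_anomalies(raw)

def gap(months):
    # Difference between the Jan-Dec and Jan-Oct averages of one year.
    return abs(sum(months) / 12 - sum(months[:10]) / 10)

raw_gap = gap(raw[2009])    # large: the seasonal cycle drags the average
anom_gap = gap(anom[2009])  # tiny: the cycle has been removed
```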
