Downscaling relies, in part, on comparing the observed historical relationships between local and large-scale weather to make global models relevant to the local level. But downscaling is a tricky business, even in a heavily instrumented data-rich place like California.
For example, some historical records show that coastal California cooled over the past few decades even as the rest of the state warmed. This contrast runs counter to expected climate trends and would imply less future warming along the California coast. The anomaly raised a red flag for Dan Feldman of Berkeley Lab and his colleagues at Indiana University, so they decided to investigate. The team found that while the underlying observations from past decades are robust, the historical records built from those observations include potentially compromised data.
“There’s a lot of great information in those historical temperature records,” said Feldman, a staff scientist in the national lab’s Earth and Environmental Sciences Area, “but for any large long-term temperature record, the technology will change.”
Some climate records are more than a century old, so the record needs to account not just for what was measured, but also for how. Older weather stations may have gaps or discontinuities in their data. Some datasets, especially in cities, include weather stations that moved over time to accommodate land-use changes – like a highway or housing project built nearby. Feldman and his colleagues discovered that the historical records that do carefully consider how temperature data were collected show no coastal cooling. This finding has major implications for future projections of downscaled coastal temperatures in California.
Their results, published in Geophysical Research Letters, show major differences among the six historical datasets they examined, precisely because some datasets account for how temperature records were collected and some do not. Two of the datasets were homogenized, meaning they were adjusted for changes in how the temperature records were collected. Four were non-homogenized, incorporating only raw data without such adjustments. Notably, evidence of coastal cooling in California showed up only in the non-homogenized datasets.
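The core idea behind homogenization can be illustrated with a toy sketch. This is not the study's actual method – real homogenization algorithms detect breakpoints statistically and cross-check against neighboring stations – but it shows how an artificial step change, such as one introduced by a station move, can be removed so the record reflects climate rather than instrumentation. The function name and example values here are hypothetical.

```python
from statistics import mean

def homogenize_step(temps, breakpoint):
    """Toy adjustment for a single known discontinuity (e.g., a station move).

    Estimates the artificial offset as the jump in mean level across the
    break, then shifts the pre-move segment onto the new baseline.
    """
    offset = mean(temps[breakpoint:]) - mean(temps[:breakpoint])
    return [t + offset for t in temps[:breakpoint]] + list(temps[breakpoint:])

# Hypothetical example: a station move after year 5 introduced a spurious
# -0.8 degree step, which would look like cooling in the raw record.
raw = [15.0, 15.1, 15.2, 15.1, 15.3, 14.2, 14.3, 14.4, 14.3, 14.5]
adjusted = homogenize_step(raw, 5)
```

In the raw series the mean drops across the break, mimicking a cooling trend; after the adjustment the two segments sit on a common baseline, so any remaining trend reflects the weather, not the station history.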
The study demonstrates the importance of data management when downscaling. Simply choosing a model that matches historical observations well can open the door to serious errors if you don't also consider how the observations were collected. The apparent, but ultimately artificial, California coastal cooling that inspired this study is one example of such an error. By carefully examining both the data and how they were collected, climate modelers can provide local users with more accurate information.
Homogenized data that’s appropriately downscaled will allow cities and communities to better plan for a warming climate. Likewise, uninterrupted data paint a more realistic picture of temperature trends. “There is always a range of projections for what climate conditions will be in the future, and it’s easier to look at the lower end of the range and hope that’s what’s going to happen,” said Feldman. “I think it’s harder to look at the upper end of that range and plan for that, but we have to make sure that the lower end of the range isn’t overly optimistic and the upper end of the range isn’t overly pessimistic.”