The National Oceanic and Atmospheric Administration has shelved an effort to reconstruct a detailed picture of hour-by-hour changes in the atmosphere stretching back to the 19th century.
Known as the 20th Century Reanalysis, the project has already helped scientists better understand the causes of historic weather events like the Dust Bowl of the 1930s and unusual Arctic warmth during the 1920s and 1930s. Those discoveries and others could eventually improve the predictions of climate models that look decades into the future.
But now the program, originally scheduled to run through 2013, is on hold as NOAA struggles with a large budget shortfall in its Climate Program Office.
Climate scientists are crying foul, calling the decision to pull the plug shortsighted.
Richard Seager, a climate scientist at Columbia University's Lamont-Doherty Earth Observatory, was part of the team that used 20th Century Reanalysis data to tease apart the drivers of the Dust Bowl. Now he's using the information to analyze the factors that produced historic flooding last year along the Ohio and Missouri rivers.
"[NOAA] defunded this at a time when everyone is using this dataset to look at long-term changes," Seager said. "To defund it now is a very strange decision."
Others lamented the cut but sympathized with NOAA's budget struggles.
"I understand where NOAA's coming from, because they didn't get the funds from Congress, but it's a very sad commentary on politics in this country," said Kevin Trenberth, head of the climate analysis section at the National Center for Atmospheric Research.
Going back in time to search forward
Researchers familiar with the program say its budget is vanishingly small, less than a tenth of 1 percent of the $4.9 billion NOAA received this year.
The 20th Century Reanalysis is one of several projects clumped into a $2 million line item that included $250,000 for reanalysis work at NOAA's lab in Boulder, Colo., and additional money for data managers at the agency's National Climatic Data Center in Asheville, N.C.
"It's a relatively inexpensive program," said Phil Arkin, director of the Cooperative Institute for Climate and Satellites at the University of Maryland. "And it does play a larger role than its cost would indicate. Reanalysis datasets are used constantly, all over science."
What scientists once called "retrospective analysis" -- these days shortened to just "reanalysis" -- was pioneered in the early 1980s at the European Centre for Medium-Range Weather Forecasts (ECMWF).
Researchers began knitting together observations of temperature, air pressure, wind speed and precipitation collected by ground-monitoring stations, weather balloons, aircraft, buoys, satellites and weather radar, feeding them into computer models to create an hour-by-hour view of the atmospheric processes that drove past weather.
But that type of reanalysis -- conducted these days by ECMWF, NASA, NOAA's National Centers for Environmental Prediction and the Japan Meteorological Agency -- is limited by its reliance on satellite data that began pouring in during the 1970s.
Relying on the satellite data makes it hard to extend a reanalysis much further back than the middle of the 20th century, said Dick Dee, head of the reanalysis section at ECMWF.
The 20th Century Reanalysis is the first attempt to get around those limits and create a record that reaches far enough into the past to reveal processes that have shaped natural climate cycles, like the El Niño Southern Oscillation, as well as the drivers of man-made climate change.
The project does so by eschewing satellite observations, relying only on temperature and pressure data to reconstruct the climate in six-hour chunks from 1871 to 2010.
But even those data are limited. Modern temperature records began in the 1880s. By the 1930s, meteorologists had started deploying weather balloons. And radar was developed during World War II.
To fill in the gaps, the NOAA team has plumbed historical records of temperature and pressure collected by naval vessels, polar explorers' expeditions, commercial ship traffic, Army bases and even Jesuit monks.
Last year, the science team behind the 20th Century Reanalysis published the first journal article describing their approach. By the time it appeared in the Quarterly Journal of the Royal Meteorological Society, researchers had already used the team's data to probe the causes of the Dust Bowl, unusual Arctic warmth in the 1920s and 1930s, the forces driving notable El Niño and La Niña events and the frequency of Atlantic hurricanes.
But now, with the program's funding cut, the 20th Century Reanalysis team has abandoned plans to update its work with data covering 2011 and the period between 1850 and 1871.
One casualty: Texas drought studies
The project's co-leader, University of Colorado research scientist Gil Compo, said that means the reanalysis won't cover the full span of the historic drought that has gripped Texas since fall 2010.
Funding cuts will prevent researchers like Seager, who have used the 20th Century Reanalysis data to tease out the secrets of the Dust Bowl, from applying the same technique to the devastating modern-day drought.
"There will be other representations of the Texas drought, but none are being done with the same technique that would allow you to study the 1930s drought using one data set," Compo said.
For now, the scientists who developed the 20th Century Reanalysis Project and those who have relied on its data must wait to see what the next budget year brings. Last year, NOAA sought $5.5 billion but received $600 million less, with Congress mandating cuts to the agency's ocean, fisheries and research accounts.
The tussle over 2013 spending begins on Monday, when the White House will release its annual budget request to Congress.