LINDAU, Germany — For many scientists, the path to the ultimate prize in their field is long, frustrating and convoluted.
Gathered on this island in a clear Bavarian lake, dozens of gray-haired physicists, chemists and biologists echoed this theme before hundreds of young researchers eager to catch some of the spark.
Eric Betzig, who received the 2014 Nobel Prize in chemistry for his work on super-resolution microscopy, recalled that he had his eyes on the prize from the outset.
But venturing into lonely uncharted territory soon led to pitfalls — results that didn’t materialize, approaches that didn’t work — and Betzig felt so defeated that he left science altogether.
"I really felt like what I was doing was a net negative to society, because it was a waste of time and a waste of taxpayers’ money," he said on stage.
Instead, he lived as an unemployed househusband for several years before joining his father’s machinery company, where he spent more than $1 million developing a tool that sold only two units.
He eventually returned to research at the Howard Hughes Medical Institute’s Janelia Farm Research Campus and succeeded in developing a breakthrough microscope system that uses light to see individual molecules within a living cell, allowing scientists to see how embryos form, diseases progress and neurons send messages.
The meandering route to discoveries like this may be difficult for other scientists to follow and even more difficult for agencies to fund, but it’s the kind of effort that leads to the breakthroughs that illuminate the darkness of the unknown.
Pressure for immediate solutions
Such scientific leaps are needed more than ever: Dire problems like climate change demand immediate solutions for cutting greenhouse gases on a global scale while strained budgets require fiscal efficiency.
The question for policymakers in the United States is where to place their bets.
In a world of finite funding, there is a risk of trading the science that wins Nobels for the science that gets patents. Companies and governments around the world are racing for improved energy storage, more efficient thermostats, cheaper solar cells and cleaner combustion, and with average temperatures rising, funding agencies are losing patience for whimsical curiosity-driven pursuits that strike out far more often than they get on base.
Yet it’s often the absent-minded professors, the laboratory mistakes or the unexpected results that completely change the game in a field, so the tension is whether to pursue incremental gains or swing for the fences.
The U.S. Department of Energy, the nation’s largest funder of physical sciences research, says it has been trying to do both, driven by its ultimate goal of fighting climate change.
"We really try to work across the full range of fundamental science all the way to energy applications," said Lynn Orr, undersecretary for science and energy at DOE.
However, since the recession, the department has remodeled itself into an economic engine, with a renewed focus on applied energy research with definite objectives — like solar panels that reach cost parity with fossil fuels — as well as early-stage commercialization efforts through programs like the Advanced Research Projects Agency-Energy and scaling up projects through the loan guarantee program.
Of the $27.4 billion enacted DOE budget for 2015, about $5 billion went to the Office of Science, while $3.7 billion went to applied energy initiatives in the Energy Efficiency and Renewable Energy division or in fossil energy programs.
Applied research has the advantage of fixed timelines and targets. Make a solar panel cheaper and more efficient, and you keep your funding; miss the deadline, and you lose it. Using this strategy, DOE has cast a wide net over a variety of energy strategies, expecting that many won’t pan out, but the ones that do will transform industries.
Or, as former Energy Secretary Steven Chu, another Nobel laureate, put it during a 2012 briefing: "It’s OK to fail, but do it fast and move on."
The problem is that fundamental research doesn’t succeed or fail in the conventional sense, so it’s difficult to gauge progress until there’s a breakthrough or until a scientist throws up her hands, bangs her head against a wall and walks away.
"There’s the old joke that research is what we’re doing when we don’t know what we’re doing," Orr said.
This kind of thinking is a tough sell for lawmakers. On Capitol Hill, the House Science Committee in July proposed adding a national interest provision for research grants from the National Science Foundation, requiring grant recipients to justify how their work will benefit the United States, which may make theoretical work harder to fund.
Fear of failure can kill innovation
Meanwhile, DOE has been clamoring for more funding for basic sciences, to little avail.
"We are probably two or three times underfunded, especially in the early stages of the innovation pipeline," Energy Secretary Ernest Moniz said during a conference call in August. "We keep proposing to Congress to increase that."
Fear of failure and the political consequences thereof also keep ambitions in check. DOE’s loan guarantee program, for example, touts a 97 percent solvency rate, a caution that suggests project selection may have been tempered by high-profile, contentious flameouts among past recipients like Solyndra (ClimateWire, Dec. 16, 2013).
In a report from the Massachusetts Institute of Technology on the "innovation deficit" published earlier this year, authors noted that private industry built its own institutions for basic research, like AT&T’s Bell Labs, which spawned eight Nobels, including Chu and Betzig.
"But over the past few decades, international competitive pressures and the short term focus of the financial sector have caused U.S. industry to move away from long-term investments in R&D and to essentially eliminate corporate sponsored basic research," they wrote.
The concern is to make sure tax dollars are spent correctly. "You can’t really schedule a discovery, but you can sure as heck follow up on it," Orr said.
Energy storage is a case in point: Storing electrons and heat for later use would solve a major problem for intermittent renewable energy sources on the grid, freeing utilities from the whims of the sun and the wind. In cars, better batteries are essential for making motors that run on electricity competitive with engines that run on fossil fuels.
But the pace of improvement has been agonizingly slow.
"The amount of energy per unit volume has increased about three or four times," said Chu, now a professor at Stanford University. "What has not been making a lot of progress is energy per unit weight, and that’s the essential thing in a car."
To meet this need, Chu in 2012 launched the Joint Center for Energy Storage Research, a $120 million Manhattan Project-like program to engineer the battery of the future (ClimateWire, Dec. 5, 2012).
However, engineers may be taking some of the fundamental physics behind batteries for granted. Researchers last year found that the prevailing model of how charge transfers inside a battery cell, the Butler-Volmer model, didn’t fit new types of battery electrodes.
Theory builds a better battery
Instead, scientists found that the Marcus-Hush-Chidsey theory better matched the performance of these devices, indicating that the rate of electron transfer matters more to a battery’s performance than the rate of ion transfer (ClimateWire, April 8, 2014). This has implications for how to engineer future batteries.
Rudolph Marcus, the theory’s namesake and a chemistry professor at the California Institute of Technology, received the 1992 Nobel Prize in chemistry for his research on electron transfer reactions. He explained that though his work rippled through energy storage, it wasn’t his goal to have any effect on the field, a fact that policymakers should keep in mind.
"There certainly is an amount of pressure that there wasn’t before to see applications of results," Marcus said. "One can understand society wanting to see results that are applied — and that’s absolutely necessary — but there’s a lot that’s come out of basic research that’s sheer curiosity."
The same consideration applies to how we invest in the next generation of scientists. "It’s hard to get money to do blue-sky research," said Christopher Chidsey, an associate professor of chemistry at Stanford, who worked with Marcus on electron transfer. "Most graduate students are working on applied problems for that reason."
"If you let the applications drive where the basic sciences go, you may not find a reason to do it. You may miss a whole lot of interesting science," he added.
The basic versus applied research debate echoes a broader climate strategy fight over whether it’s better to run with the technologies we have now or sit tight for better batteries, turbines and carbon scrubbers (ClimateWire, May 3, 2013).
And the intense focus on applied research raises concerns that technology improvements are creating a moral hazard when it comes to climate change, leaving people content that humanity will invent its way out of the problem.
That complacency threatens to undermine targets of keeping global warming below 2 degrees Celsius. The International Energy Agency found earlier this year that none of the clean energy fields it tracks on a global scale is meeting its climate objectives.
"As a result, our ability to deliver a future in which temperatures rise modestly is at risk of being jeopardized," the agency’s executive director, Maria van der Hoeven, wrote in a paper summarizing the findings.
"The world is not moving fast enough, it’s not moving hard enough, and I am getting more concerned," Chu said. "We should also not think it’s too late. ‘Too late’ is an easy excuse to say, ‘I’m not going to try anymore.’"