The worst computer vulnerability in recent years was in a ubiquitous piece of open-source software — a bug that was as simple to exploit as it was difficult to patch.
The Apache Log4j security flaw opened the door to millions of computers, but the extent of the damage still isn’t fully understood. Nearly a year later, federal officials and Congress are still discussing how to avoid another potential disaster.
Open-source software, whose code is publicly available for anyone to use or modify, can be found in nearly every type of modern technology. It has served as the backbone of the internet and is pervasive throughout the economy, including in the energy sector.
That makes it a looming issue for energy cybersecurity.
“Of course, [the Energy Department] is concerned about open-source software,” said Cheri Caddy, a former senior adviser at DOE who is currently director of cyber policy and plans at the Office of the National Cyber Director. “Open-source software is a part of all software development, whether it’s [operational technology] or IT. It’s just ubiquitous in everything now.”
The Log4j security lapse highlighted some of the key concerns: The development team was small, the software was found in nearly every industry, and many companies were unsure if they even had the code in their products.
The problem, experts say, is not that open source is inherently less secure than proprietary software. It’s not. But a few lines of code can be adopted throughout an entire industry.
When those few lines contain a serious vulnerability, that can be a problem for critical infrastructure, including the grid. It can become an open door that allows malicious hackers to walk into critical systems — especially when utilities aren’t aware that the door even exists.
Open source is everywhere
In the energy sector, open-source software is everywhere, said Virginia Wright, an energy cybersecurity portfolio program manager at Idaho National Laboratory (INL).
Wright manages a DOE grid vulnerability test bed called Cyber Testing for Resilient Industrial Control Systems (CyTRICS). The program, run by six DOE labs and led by INL, ferrets out vulnerabilities in the software that runs the power grid.
“One hundred percent of the systems that we have looked at have contained open-source software,” Wright said.
CyTRICS works on a voluntary basis with some of the biggest grid equipment manufacturers, like Hitachi Energy and Schweitzer Engineering Laboratories. Once a vulnerability is found, the lab reaches out to the manufacturers with potential mitigation measures to help patch the bug.
Sometimes those are publicly known vulnerabilities. Because open-source code is freely available and reused so widely, vendors may not realize that a flaw in their product has already been disclosed and patched upstream, Wright said.
Wright said the labs have seen grid equipment vendors selling older versions of their products that contain known vulnerabilities for which fixes already exist. In some cases the software has even been updated in those vendors’ own systems, yet their customers are “buying it with all of the vulnerabilities attached,” she said.
To avoid software with vulnerabilities, utilities “need to employ a pretty rigorous evaluation and testing process on their own,” she said.
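That kind of evaluation can start with something as simple as checking each third-party component in a product against a public vulnerability database. The sketch below is an illustration of the idea, not anything DOE or the labs prescribe: it queries the OSV.dev database for a couple of made-up component versions, and the exact request and response fields should be verified against OSV's current documentation.

```python
# Sketch: check a list of components against the public OSV.dev vulnerability
# database. The component inventory is made up for illustration; treat the
# request/response shapes as assumptions to confirm against the OSV docs.
import json
import urllib.request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

# Hypothetical inventory of third-party components found in a product.
components = [
    {"ecosystem": "Maven", "name": "org.apache.logging.log4j:log4j-core", "version": "2.14.1"},
    {"ecosystem": "PyPI", "name": "requests", "version": "2.25.0"},
]

def known_vulnerabilities(component):
    """Ask OSV for advisories that affect this exact package version."""
    payload = json.dumps({
        "version": component["version"],
        "package": {"name": component["name"], "ecosystem": component["ecosystem"]},
    }).encode("utf-8")
    req = urllib.request.Request(OSV_QUERY_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("vulns", [])

for comp in components:
    advisories = [v.get("id") for v in known_vulnerabilities(comp)]
    print(f"{comp['name']} {comp['version']}: {advisories or 'no known advisories'}")
```

A real assessment goes much further, covering firmware, configurations and code that never shows up in a package manifest, but the underlying question is the same: Is this exact version already known to be vulnerable?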
The bipartisan infrastructure bill codifies the CyTRICS program and places it under the Cyber Sense program. By September of next year, DOE aims to analyze around 10 percent of critical components in energy systems and expand the program’s voluntary partnerships to cover around 15 percent of market share, according to DOE’s two-year performance goal.
DOE also launched a pilot program for an energy-focused “software bill of materials,” an inventory of the code components inside a product that works much like the food industry’s ingredient label. Such a label, experts say, can increase visibility into the software that runs critical infrastructure.
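As an illustration of the idea, here is a minimal sketch of what one entry in such an inventory might look like, loosely following the open CycloneDX SBOM format; the component and version shown are placeholders, not anything from DOE’s pilot.

```python
# Sketch: a minimal software bill of materials (SBOM), loosely modeled on the
# CycloneDX JSON format. The component listed is illustrative only.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "components": [
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.2",
            # The package URL (purl) pins down exactly which upstream package
            # this is, which is what lets an operator match the entry against
            # published vulnerability advisories.
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.2",
        }
    ],
}

print(json.dumps(sbom, indent=2))
```

With an inventory like this shipped alongside a product, an operator can answer “do we run Log4j anywhere?” by searching the entries rather than tearing apart the systems themselves.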
Congress also has begun to take further action. Sens. Gary Peters (D-Mich.) and Rob Portman (R-Ohio) — the chair and ranking member, respectively, of the Senate Homeland Security and Governmental Affairs Committee — have moved forward legislation that would direct the Cybersecurity and Infrastructure Security Agency (CISA) to study ways to mitigate risks in critical infrastructure that uses open-source software.
A trade-off
The transparency of open-source software means that malicious hackers can look at the source code to find new vulnerabilities, said Keith Lunden, manager of cyber physical threat analysis at cybersecurity firm Mandiant.
However, it’s a two-way street. Cybersecurity researchers have the same access, so they can identify and fix those vulnerabilities before malicious hackers have a chance to exploit them, Lunden said.
And unlike proprietary software, open-source software doesn’t have a shelf life. Vendors eventually stop supporting a proprietary product; the same isn’t true for open source. For industrial systems that are designed to operate for decades, that longevity is key.
“With open-source software, the community has access to the source, and they can independently develop patches indefinitely, which can be an important factor for OT security,” Lunden said.
At least that’s the idea.
The flexibility of open source also means the code is constantly branching into new versions: Individuals and companies may adapt it for their own use, potentially introducing new vulnerabilities.
Thomas Pace, co-founder of cybersecurity firm NetRise and a former DOE contractor in industrial control security, said he knows of a major telecommunications vendor that will take open-source software and rewrite portions of the code.
“That just then introduces a different set of problems, right? Because now you have to maintain your own code versus the whole community maintaining the code,” he said. “Is that better, is that worse? That’s a debate.”
An open-source bug can also mean widespread risk. In 2014, hackers took advantage of a massive vulnerability in OpenSSL, an open-source encryption library.
But the incident, known as Heartbleed, was a single vulnerability. Once the bug was fixed, the onus was on vendors and system owners to apply the patch. If, instead, each software vendor had built its own version of OpenSSL, each version could have carried its own set of vulnerabilities, each needing its own fix.
“So it’s about a trade-off,” said Wright.
Lessons from the ‘worst’ vulnerability
The discovery of the Log4j vulnerability prompted the White House to hold an open-source software security summit last January. The meeting, which included top U.S. cyber experts, agency officials and open-source leaders like the Linux Foundation, aimed to improve collaboration between the federal government and the private sector on securing the software.
In the months since, CISA has promoted the use of a software bill of materials as a step toward securing open-source software. The agency also plans to work with the open-source security community to identify commonly used code in critical infrastructure, in an effort to better understand where collaboration can take place.
But the agency highlighted that it can be a challenge to work with an open-source community when, by definition, it’s open to anyone. While there are some foundations that promote open-source development, software is often developed by small teams or single individuals.
In the meantime, CISA, the National Security Agency and the Office of the Director of National Intelligence released best practices to help open-source developers better secure their code.
As for the Log4j vulnerability, “significant risk remains,” according to a report this year from the Department of Homeland Security’s Cyber Safety Review Board.
The board, created under a 2021 executive order, found that systems running vulnerable versions of Log4j would remain a major issue for “perhaps a decade or longer.”
The report concludes that the vulnerability did not lead to significant cyberattacks on critical infrastructure.
But NetRise’s Pace called that an “impossible statement,” and even the report notes that it’s not so cut-and-dried.
“While cybersecurity vendors were able to provide some anecdotal evidence of exploitation, no authoritative source exists to understand exploitation trends across geographies, industries, or ecosystems. Many organizations do not even collect information on specific Log4j exploitation, and reporting is still largely voluntary,” the board wrote in the report.
In short, organizations themselves sometimes aren’t aware that they have been targeted by malicious hackers. And there is no comprehensive list of where the Log4j software is installed.
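Absent such a list, finding the software means looking for it directly. The sketch below is a simplistic illustration of that hunt, not a vetted scanner: it walks a directory tree and reports any Log4j core JAR files it finds. The scan root is a placeholder, and copies of Log4j bundled inside other archives would be missed by a filename check like this.

```python
# Sketch: walk a directory tree and report any Log4j core JAR files found,
# so an operator at least knows where the library is present. Illustration
# only; not a substitute for a real software-inventory or vulnerability scanner.
import os
import re

SCAN_ROOT = "/opt"  # placeholder: point this at the filesystem to inventory
LOG4J_JAR = re.compile(r"^log4j-core-(\d[\w.\-]*)\.jar$")

findings = []
for dirpath, _dirnames, filenames in os.walk(SCAN_ROOT):
    for name in filenames:
        match = LOG4J_JAR.match(name)
        if match:
            findings.append((os.path.join(dirpath, name), match.group(1)))

# Note: Log4j embedded inside fat JARs, WARs or other archives will not show
# up here, which is part of why inventories proved so hard in practice.
for path, version in findings:
    print(f"Found log4j-core {version} at {path}")
if not findings:
    print("No log4j-core JARs found under", SCAN_ROOT)
```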
The report also highlights the “security risks unique to the thinly-resourced, volunteer-based open source community.” It calls for centralized resources to help developers ensure their code is created to the latest security standards.
“Just as the software industry has enabled the democratization of software programming — the ability for anyone to generate software with little or no formal training — we must also democratize security by baking security by default into the platforms used to generate, build, deploy, and manage software at scale,” the report concludes.