Perspectives on the SolarWinds Incident


Excerpt

A serious cybersecurity event was recently revealed: malicious actors had gained access to the source code for the SolarWinds Orion monitoring and management software. They inserted malware into that source code so that, when the software was distributed to and deployed by SolarWinds customers as part of an update, the malicious code could be used to surveil the customers who had unknowingly installed it and to gain potentially arbitrary control over the systems managed by Orion. Such a level of control has, of course, given attackers opportunities for further exploitation as well.

At the time of this writing, SolarWinds reports that the update containing the malware was installed by thousands of customers, including numerous U.S. federal agencies and businesses around the world. The cybersecurity company FireEye was among the first organizations reported to have actually been compromised by the malware.

Numerous technical details on this incident have appeared online and in a variety of other publications, as well as from SolarWinds itself.1 At the same time, there remains a great deal to say about how we might best respond to and recover from the incident, as well as how we might avoid similar events in the future.

The news of this event broke at a time that made it infeasible for IEEE Security & Privacy to publish a full, detailed piece by the production deadline for this issue. However, given the magnitude of the incident, the Editorial Board still wanted to address it in some way rather than wait two months for the next issue. Therefore, this issue contains two pieces related to the SolarWinds incident. This first article contains brief perspectives from members of the IEEE Security & Privacy Editorial Board, including questions they raised as well as some suggested solutions. The second is a companion “Point–Counterpoint” column article by Fabio Massacci and Trent Jaeger that digs specifically into the quandaries and questions of software patching that relate to the SolarWinds incident. Additional details will undoubtedly continue to surface, and IEEE Security & Privacy expects to cover this incident further and in greater detail in future issues as we continue to learn about the compromise and its effects.—Sean Peisert

Editorial Board Members’ Perspectives

SolarWinds and Market Incentives

Bruce Schneier

The penetration of government and corporate networks worldwide is the result of inadequate cyberdefenses across the board. The lessons are many, but I want to focus on one important one we’ve learned: the software that’s managing our critical networks isn’t secure, and that’s because the market doesn’t reward that security.

SolarWinds is a perfect example. The company was the initial infection vector for much of the operation. Its trusted position inside so many critical networks made it a perfect target for a supply-chain attack, and its shoddy security practices made it an easy target.

Why did SolarWinds have such bad security? The answer is because it was more profitable. The company is owned by Thoma Bravo, a private-equity firm known for radical cost-cutting in the name of short-term profit. Under CEO Kevin Thompson, the company underspent on security even as it outsourced software development. The New York Times reports that the company’s cybersecurity advisor quit after his “basic recommendations were ignored.” In a very real sense, SolarWinds profited because it secretly shifted a whole bunch of risk to its customers: the U.S. government, IT companies, and others.

This problem isn’t new, and, while it’s exacerbated by the private-equity funding model, it’s not unique to it. In general, the market doesn’t reward safety and security—especially when the effects of ignoring those things are long term and diffuse. The market rewards short-term profits at the expense of safety and security. (Watch and see whether SolarWinds suffers any long-term effects from this hack, or whether Thoma Bravo’s bet that it could profit by selling an insecure product was a good one.)

The solution here is twofold. The first is to improve government software procurement. Software is now critical to national security. Any system of procuring that software needs to evaluate the security of the software and the security practices of the company, in detail, to ensure that they are sufficient to meet the security needs of the network they’re being installed in. If these evaluations are made public, along with the list of companies that meet them, all network buyers can benefit from them. It’s a win for everybody.

But that isn’t enough; we need a second part. The only way to force companies to provide safety and security features for customers is through regulation. This is true whether we want seatbelts in our cars, basic food safety at our restaurants, pajamas that don’t catch on fire, or home routers that aren’t vulnerable to cyberattack. The government needs to set minimum security standards for software that’s used in critical network applications, just as it sets software standards for avionics.

Without these two measures, it’s just too easy for companies to act like SolarWinds: save money by skimping on safety and security and hope for the best in the long term. That’s the rational thing for companies to do in an unregulated market, and the only way to change that is to change the economic incentives.

