Entries Tagged "security engineering"


When Biology Becomes Software

All of life is based on the coordinated action of genetic parts (genes and their controlling sequences) found in the genomes (the complete DNA sequences) of organisms.

Genes and genomes are based on code — just like the digital language of computers. But instead of zeros and ones, four DNA letters — A, C, T, G — encode all of life. (Life is messy, and there are actually all sorts of edge cases, but ignore that for now.) If you have the sequence that encodes an organism, in theory, you could recreate it. If you can write new working code, you can alter an existing organism or create a novel one.
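To make the “code is code” analogy concrete, here is a purely illustrative sketch in Python. It is a toy, not how real bioinformatics tools store sequences: it simply packs the four DNA letters into two bits each, the way a programmer might treat a genome as one long binary string.

```python
# Purely illustrative: packing the four DNA letters into two bits each.
# A toy encoding to make the "code is code" analogy concrete; real genomes
# carry structure and chemistry that this ignores.
ENCODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
DECODE = {bits: base for base, bits in ENCODE.items()}

def dna_to_bits(sequence: str) -> str:
    return "".join(f"{ENCODE[base]:02b}" for base in sequence)

def bits_to_dna(bits: str) -> str:
    return "".join(DECODE[int(bits[i:i + 2], 2)] for i in range(0, len(bits), 2))

fragment = "ACTGGTCA"                    # a made-up eight-letter fragment
packed = dna_to_bits(fragment)           # '0001111010110100'
assert bits_to_dna(packed) == fragment
```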

If this sounds to you a lot like software coding, you’re right. As synthetic biology looks more like computer technology, the risks of the latter become the risks of the former. Code is code, but because we’re dealing with molecules — and sometimes actual forms of life — the risks can be much greater.

Imagine a biological engineer trying to increase the expression of a gene that maintains normal gene function in blood cells. Even though it’s a relatively simple operation by today’s standards, it’ll almost certainly take multiple tries to get it right. Were this computer code, the only damage those failed tries would do is crash the computer they’re running on. With a biological system, the code could instead increase the likelihood of multiple types of leukemias and wipe out cells important to the patient’s immune system.

We have known the mechanics of DNA for some 60-plus years. The field of modern biotechnology began in 1972 when Paul Berg joined one virus gene to another and produced the first “recombinant” virus. Synthetic biology arose in the early 2000s when biologists adopted the mindset of engineers; instead of moving single genes around, they designed complex genetic circuits.

In 2010, Craig Venter and his colleagues recreated the genome of a simple bacterium. More recently, researchers at the Medical Research Council Laboratory of Molecular Biology in Britain created a new, more streamlined version of E. coli. In both cases, the researchers created what could arguably be called new forms of life.

This is the new bioengineering, and it will only get more powerful. Today you can write DNA code in the same way a computer programmer writes computer code. Then you can use a DNA synthesizer or order DNA from a commercial vendor, and then use precision editing tools such as CRISPR to “run” it in an already existing organism, from a virus to a wheat plant to a person.

In the future, it may be possible to build an entire complex organism such as a dog or cat, or to recreate an extinct mammoth (an effort currently underway). Today, biotech companies are developing new gene therapies, and international consortia are addressing the feasibility and ethics of making changes to human genomes that could be passed down to succeeding generations.

Within the biological science community, urgent conversations are occurring about “cyberbiosecurity,” an admittedly contested term for the space where biological and information systems intersect and where vulnerabilities in one can affect the other. These risks include the security of DNA databanks, the fidelity of transmission of those data, and information hazards associated with specific DNA sequences that could encode novel pathogens for which no cures exist.

These risks have occupied not only learned bodies — the National Academies of Sciences, Engineering, and Medicine published at least a half dozen reports on biosecurity risks and how to address them proactively — but also the mainstream media: genome editing was a major plot element in Season 3 of Netflix’s “Designated Survivor.”

Our worries are more prosaic. As synthetic biology “programming” reaches the complexity of traditional computer programming, the risks of computer systems will transfer to biological systems. The difference is that biological systems have the potential to cause much greater, and far more lasting, damage than computer systems.

Programmers write software through trial and error. Because computer systems are so complex and there is no real theory of software, programmers repeatedly test the code they write until it works properly. This makes sense, because the cost of getting it wrong is low and trying again is easy. There are even jokes about this: a programmer would diagnose a car crash by putting another car in the same situation and seeing if it happened again.

Even finished code still has problems. Again due to the complexity of modern software systems, “works properly” doesn’t mean that it’s perfectly correct. Modern software is full of bugs — thousands of software flaws — that occasionally affect performance or security. That’s why any piece of software you use is regularly updated; the developers are still fixing bugs, even after the software is released.

Bioengineering will be largely the same: writing biological code will have these same reliability properties. Unfortunately, the software solution of making lots of mistakes and fixing them as you go doesn’t work in biology.

In nature, a similar type of trial and error is handled by “the survival of the fittest” and occurs slowly over many generations. But human-generated code from scratch doesn’t have that kind of correction mechanism. Inadvertent or intentional release of these newly coded “programs” may result in pathogens of expanded host range (just think swine flu) or organisms that wreck delicate ecological balances.

Unlike computer software, there’s no way so far to “patch” biological systems once released to the wild, although researchers are trying to develop one. Nor are there ways to “patch” the humans (or animals or crops) susceptible to such agents. Stringent biocontainment helps, but no containment system provides zero risk.

Opportunities for mischief and malfeasance often occur when expertise is siloed, when fields intersect only at the margins, and when the gathered knowledge of small, expert groups doesn’t make its way into the larger body of practitioners who have important contributions to make.

Good starts have been made by biologists, security agencies, and governance experts. But these efforts have tended to be siloed: confined to either the biological or the digital sphere of influence, classified and solely within the military, or exchanged only among a very small set of investigators.

What we need are more opportunities for integration between the two disciplines. We need to share information and experiences, classified and unclassified. Our digital and biological communities already have tools to identify and mitigate biological risks, as well as tools to write and deploy secure computer systems.

Those opportunities will not occur without effort or financial support. Let’s find those resources, whether public, private, philanthropic, or any combination. And then let’s use those resources to set up some novel opportunities for digital geeks and bionerds — as well as ethicists and policy makers — to share experiences and concerns, and come up with creative, constructive solutions to these problems that are more than just patches.

These are overarching problems; let’s not let siloed thinking or funding get in the way of breaking down barriers between communities. And let’s not let technology of any kind get in the way of the public good.

This essay previously appeared on CNN.com.

EDITED TO ADD (9/23): Commentary.

Posted on September 13, 2019 at 11:40 AM

The Myth of Consumer-Grade Security

The Department of Justice wants access to encrypted consumer devices but promises not to infiltrate business products or affect critical infrastructure. Yet that’s not possible, because there is no longer any difference between those categories of devices. Consumer devices are critical infrastructure. They affect national security. And it would be foolish to weaken them, even at the request of law enforcement.

In his keynote address at the International Conference on Cybersecurity, Attorney General William Barr argued that companies should weaken encryption systems to gain access to consumer devices for criminal investigations. Barr repeated a common fallacy about a difference between military-grade encryption and consumer encryption: “After all, we are not talking about protecting the nation’s nuclear launch codes. Nor are we necessarily talking about the customized encryption used by large business enterprises to protect their operations. We are talking about consumer products and services such as messaging, smart phones, e-mail, and voice and data applications.”

The thing is, that distinction between military and consumer products largely doesn’t exist. All of those “consumer products” Barr wants access to are used by government officials — heads of state, legislators, judges, military commanders and everyone else — worldwide. They’re used by election officials, police at all levels, nuclear power plant operators, CEOs and human rights activists. They’re critical to national security as well as personal security.

This wasn’t true during much of the Cold War. Before the Internet revolution, military-grade electronics were different from consumer-grade. Military contracts drove innovation in many areas, and those sectors got the cool new stuff first. That started to change in the 1980s, when consumer electronics started to become the place where innovation happened. The military responded by creating a category of military hardware called COTS: commercial off-the-shelf technology. More consumer products became approved for military applications. Today, pretty much everything that doesn’t have to be hardened for battle is COTS and is the exact same product purchased by consumers. And a lot of battle-hardened technologies are the same computer hardware and software products as the commercial items, but in sturdier packaging.

Through the mid-1990s, there was a difference between military-grade encryption and consumer-grade encryption. Laws regulated encryption as a munition and limited what could legally be exported only to key lengths that were easily breakable. That changed with the rise of Internet commerce, because the needs of commercial applications more closely mirrored the needs of the military. Today, the predominant encryption algorithm for commercial applications — Advanced Encryption Standard (AES) — is approved by the National Security Agency (NSA) to secure information up to the level of Top Secret. The Department of Defense’s classified analogs of the Internet — Secret Internet Protocol Router Network (SIPRNet), Joint Worldwide Intelligence Communications System (JWICS) and probably others whose names aren’t yet public — use the same Internet protocols, software, and hardware that the rest of the world does, albeit with additional physical controls. And the NSA routinely assists in securing business and consumer systems, including helping Google defend itself from Chinese hackers in 2010.
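To underline the point, here is a minimal sketch of AES-256-GCM encryption in Python, using the third-party cryptography package (an assumption of convenience; any standard AES implementation would do). The algorithm is the same whether the plaintext is a consumer chat backup or a classified document; only the key management and surrounding controls differ.

```python
# Minimal AES-256-GCM sketch using the third-party "cryptography" package.
# The same algorithm protects consumer and government data alike; what differs
# is key management and the physical/procedural controls around it.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit AES key
nonce = os.urandom(12)                      # 96-bit nonce; never reuse with the same key
aesgcm = AESGCM(key)

ciphertext = aesgcm.encrypt(nonce, b"the same crypto for everyone", None)
assert aesgcm.decrypt(nonce, ciphertext, None) == b"the same crypto for everyone"
```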

Yes, there are some military applications that are different. The US nuclear system Barr mentions is one such example — and it uses ancient computers and 8-inch floppy drives. But for pretty much everything that doesn’t see active combat, it’s modern laptops, iPhones, the same Internet everyone else uses, and the same cloud services.

This is also true for corporate applications. Corporations rarely use customized encryption to protect their operations. They also use the same types of computers, networks, and cloud services that the government and consumers use. Customized security is both more expensive because it is unique, and less secure because it’s nonstandard and untested.

During the Cold War, the NSA had the dual mission of attacking Soviet computers and communications systems and defending domestic counterparts. It was possible to do both simultaneously only because the two systems were different at every level. Today, the entire world uses Internet protocols; iPhones and Android phones; and iMessage, WhatsApp and Signal to secure their chats. Consumer-grade encryption is the same as military-grade encryption, and consumer security is the same as national security.

Barr can’t weaken consumer systems without also weakening commercial, government, and military systems. There’s one world, one network, and one answer. As a matter of policy, the nation has to decide which takes precedence: offense or defense. If security is deliberately weakened, it will be weakened for everybody. And if security is strengthened, it is strengthened for everybody. It’s time to accept the fact that these systems are too critical to society to weaken. Everyone will be more secure with stronger encryption, even if it means the bad guys get to use that encryption as well.

This essay previously appeared on Lawfare.com.

Posted on August 28, 2019 at 6:14 AM

Supply-Chain Attack against the Electron Development Platform

Electron is a cross-platform development framework used by many popular communications apps, including Skype, Slack, and WhatsApp. Security vulnerabilities in its update system allow someone to silently inject malicious code into applications. From a news article:

At the BSides LV security conference on Tuesday, Pavel Tsakalidis demonstrated a tool he created called BEEMKA, a Python-based tool that allows someone to unpack Electron ASAR archive files and inject new code into Electron’s JavaScript libraries and built-in Chrome browser extensions. The vulnerability is not part of the applications themselves but of the underlying Electron framework — and that vulnerability allows malicious activities to be hidden within processes that appear to be benign. Tsakalidis said that he had contacted Electron about the vulnerability but that he had gotten no response — and the vulnerability remains.

While making these changes required administrator access on Linux and MacOS, it only requires local access on Windows. Those modifications can create new event-based “features” that can access the file system, activate a Web cam, and exfiltrate information from systems using the functionality of trusted applications — including user credentials and sensitive data. In his demonstration, Tsakalidis showed a backdoored version of Microsoft Visual Studio Code that sent the contents of every code tab opened to a remote website.

Basically, the Electron ASAR files aren’t signed or encrypted, so modifying them is easy.
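Since the framework doesn’t verify the archive itself, about the only stopgap is out-of-band integrity checking. Here is a rough sketch; the install path and reference digest are hypothetical, and a check like this obviously can’t stop an attacker who also controls the machine doing the checking.

```python
# Rough sketch of out-of-band integrity checking for an Electron app.asar file.
# The path and reference digest are hypothetical; in practice the digest would
# come from a trusted release manifest kept out of the attacker's reach.
import hashlib
from pathlib import Path

ASAR_PATH = Path("/opt/example-app/resources/app.asar")   # hypothetical install path
KNOWN_GOOD_SHA256 = "0" * 64                               # placeholder reference digest

def asar_is_unmodified(path: Path, expected_hex: str) -> bool:
    return hashlib.sha256(path.read_bytes()).hexdigest() == expected_hex

if __name__ == "__main__":
    status = "unmodified" if asar_is_unmodified(ASAR_PATH, KNOWN_GOOD_SHA256) else "MODIFIED"
    print(status)
```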

Note that this attack requires local access to the computer, which means that an attacker who could do this could do much more damaging things as well. But once an app has been modified, it can be distributed to other users. It’s not a big-deal attack, but it’s a vulnerability that should be closed.

Posted on August 8, 2019 at 11:11 AM

Wanted: Cybersecurity Imagery

Eli Sugarman of the Hewlett Foundation laments the sorry state of cybersecurity imagery:

The state of cybersecurity imagery is, in a word, abysmal. A simple Google Image search for the term proves the point: It’s all white men in hoodies hovering menacingly over keyboards, green “Matrix”-style 1s and 0s, glowing locks and server racks, or some random combination of those elements — sometimes the hoodie-clad men even wear burglar masks. Each of these images fails to convey anything about either the importance or the complexity of the topic — or the huge stakes for governments, industry and ordinary people alike inherent in topics like encryption, surveillance and cyber conflict.

I agree that this is a problem. It’s not something I noticed until recently. I work in words. I think in words. I don’t use PowerPoint (or anything similar) when I give presentations. I don’t need visuals.

But recently, I started teaching at the Harvard Kennedy School, and I constantly use visuals in my class. I made those same image searches, and I came up with similarly unacceptable results.

But unlike me, Hewlett is doing something about it. You can help: participate in the Cybersecurity Visuals Challenge.

EDITED TO ADD (8/5): News article. Slashdot thread.

Posted on July 29, 2019 at 6:15 AM

Insider Logic Bombs

Add to the “not very smart criminals” file:

According to court documents, Tinley provided software services for Siemens’ Monroeville, PA offices for nearly ten years. Among the work he was asked to perform was the creation of spreadsheets that the company was using to manage equipment orders.

The spreadsheets included custom scripts that would update the content of the file based on current orders stored in other, remote documents, allowing the company to automate inventory and order management.

But while Tinley’s files worked for years, they started malfunctioning around 2014. According to court documents, Tinley planted so-called “logic bombs” that would trigger after a certain date, and crash the files.

Every time the scripts would crash, Siemens would call Tinley, who’d fix the files for a fee.
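The “logic bomb” here is nothing exotic. Stripped of the spreadsheet plumbing, the pattern described is just a date check; below is a defanged, purely illustrative sketch in Python (the trigger date is made up, and the only “payload” is an exception).

```python
# Defanged illustration of the date-triggered pattern described above.
# The real scripts were spreadsheet macros; the structure is the same:
# after an arbitrary date, the code simply stops working.
from datetime import date

TRIGGER_DATE = date(2014, 1, 1)   # made-up trigger date

def update_orders() -> None:
    if date.today() >= TRIGGER_DATE:
        raise RuntimeError("simulated crash: logic bomb triggered")
    print("orders updated normally")

if __name__ == "__main__":
    update_orders()
```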

Posted on July 26, 2019 at 6:05 AM

Software Developers and Security

According to a survey: “68% of the security professionals surveyed believe it’s a programmer’s job to write secure code, but they also think less than half of developers can spot security holes.” And that’s a problem.

Nearly half of security pros surveyed, 49%, said they struggle to get developers to make remediation of vulnerabilities a priority. Worse still, 68% of security professionals feel fewer than half of developers can spot security vulnerabilities later in the life cycle. Roughly half of security professionals said they most often found bugs after code is merged in a test environment.

At the same time, nearly 70% of developers said that while they are expected to write secure code, they get little guidance or help. One disgruntled programmer said, “It’s a mess, no standardization, most of my work has never had a security scan.”

Another problem is that many companies don’t seem to take security seriously enough. Nearly 44% of those surveyed reported that they’re not judged on their security vulnerabilities.

Posted on July 25, 2019 at 6:17 AM

Yubico Security Keys with a Crypto Flaw

Wow, is this an embarrassing bug:

Yubico is recalling a line of security keys used by the U.S. government due to a firmware flaw. The company issued a security advisory today that warned of an issue in YubiKey FIPS Series devices with firmware versions 4.4.2 and 4.4.4 that reduced the randomness of the cryptographic keys it generates. The security keys are used by thousands of federal employees on a daily basis, letting them securely log-on to their devices by issuing one-time passwords.

The problem in question occurs after the security key powers up. According to Yubico, a bug keeps “some predictable content” inside the device’s data buffer that could impact the randomness of the keys generated. Security keys with ECDSA signatures are in particular danger. A total of 80 of the 256 bits generated by the key remain static, meaning an attacker who gains access to several signatures could recreate the private key.
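To see why static nonce bits are so dangerous, consider the simpler extreme where an attacker learns an entire nonce k for one signature (recovering the key from 80 known bits across several signatures uses lattice techniques, but the underlying algebra is the same idea). An ECDSA signature satisfies s = k^-1 (h + r*d) mod n, so a known k lets anyone solve for the private key d. Here is a toy sketch with made-up numbers; no real curve arithmetic is involved, and r is treated as a given value.

```python
# Toy illustration of ECDSA private-key recovery when the nonce k is known.
# All values are made up; n is the secp256k1 group order, and r stands in for
# the x-coordinate of k*G (no actual elliptic-curve arithmetic is performed).
# Requires Python 3.8+ for modular inverses via pow(x, -1, n).
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

d = 0x123456789ABCDEF0   # "secret" signing key (made up)
k = 0x0F0E0D0C0B0A0908   # nonce the attacker has learned (made up)
h = 0xCAFEBABE           # hash of the signed message (made up)
r = 0xDEADBEEF           # stand-in for the signature's r value

# Signer computes s = k^-1 * (h + r*d) mod n
s = pow(k, -1, n) * (h + r * d) % n

# An attacker who knows k recovers d = r^-1 * (s*k - h) mod n
recovered = pow(r, -1, n) * (s * k - h) % n
assert recovered == d
```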

Boing Boing post.

EDITED TO ADD (6/12): From Microsoft TechNet Security Guidance blog (in 2014): “Why We’re Not Recommending ‘FIPS Mode’ Anymore.”

Posted on July 1, 2019 at 5:55 AM

Hacking Hardware Security Modules

Security researchers Gabriel Campana and Jean-Baptiste Bédrune are giving a hardware security module (HSM) talk at BlackHat in August:

This highly technical presentation targets an HSM manufactured by a vendor whose solutions are usually found in major banks and large cloud service providers. It will demonstrate several attack paths, some of them allowing unauthenticated attackers to take full control of the HSM. The presented attacks allow retrieving all HSM secrets remotely, including cryptographic keys and administrator credentials. Finally, we exploit a cryptographic bug in the firmware signature verification to upload a modified firmware to the HSM. This firmware includes a persistent backdoor that survives a firmware update.

They have an academic paper in French, and a presentation of the work. Here’s a summary in English.

There were plenty of technical challenges to solve along the way, in what was clearly a thorough and professional piece of vulnerability research:

  1. They started by using legitimate SDK access to their test HSM to upload a firmware module that would give them a shell inside the HSM. Note that this SDK access was used to discover the attacks, but is not necessary to exploit them.
  2. They then used the shell to run a fuzzer on the internal implementation of PKCS#11 commands to find reliable, exploitable buffer overflows.
  3. They checked they could exploit these buffer overflows from outside the HSM, i.e., by just calling the PKCS#11 driver from the host machine (a rough sketch of this kind of interface fuzzing appears after the list).
  4. They then wrote a payload that would override access control and, via another issue in the HSM, allow them to upload arbitrary (unsigned) firmware. It’s important to note that this backdoor is persistent: a subsequent update will not fix it.
  5. They then wrote a module that would dump all the HSM secrets, and uploaded it to the HSM.
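For a rough sense of what that kind of host-side interface fuzzing might look like, here is a sketch in Python. The call_pkcs11_command() wrapper is a hypothetical placeholder for whatever binding to the vendor’s PKCS#11 driver is available; the idea is simply to throw malformed, oversized inputs at each command and record which ones crash.

```python
# Very rough sketch of dumb interface fuzzing against a PKCS#11-style command
# handler. call_pkcs11_command() is a hypothetical placeholder; a real harness
# would bind to the vendor's PKCS#11 library (e.g., via ctypes) and would also
# watch for driver or HSM crashes, not just Python exceptions.
import os
import random

def call_pkcs11_command(command_id: int, payload: bytes) -> None:
    """Hypothetical stand-in for a host-side call into the PKCS#11 driver."""
    pass  # replace with a real binding

def fuzz(command_ids, iterations=1000):
    crashes = []
    for _ in range(iterations):
        cmd = random.choice(list(command_ids))
        payload = os.urandom(random.randint(0, 0x10000))  # up to 64 KiB of junk
        try:
            call_pkcs11_command(cmd, payload)
        except Exception as exc:
            crashes.append((cmd, len(payload), repr(exc)))
    return crashes

if __name__ == "__main__":
    print(fuzz(command_ids=range(1, 64), iterations=100))
```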

Posted on June 20, 2019 at 6:56 AM

Chinese Military Wants to Develop Custom OS

Citing security concerns, the Chinese military wants to replace Windows with its own custom operating system:

Thanks to the Snowden, Shadow Brokers, and Vault7 leaks, Beijing officials are well aware of the US’ hefty arsenal of hacking tools, available for anything from smart TVs to Linux servers, and from routers to common desktop operating systems, such as Windows and Mac.

Since these leaks have revealed that the US can hack into almost anything, the Chinese government’s plan is to adopt a “security by obscurity” approach and run a custom operating system that will make it harder for foreign threat actors — mainly the US — to spy on Chinese military operations.

It’s unclear exactly how custom this new OS will be. It could be a Linux variant, like North Korea’s Red Star OS. Or it could be something completely new. Normally, I would be highly skeptical of a country being able to write and field its own custom operating system, but China is one of the few that is large enough to actually be able to do it. So I’m just moderately skeptical.

EDITED TO ADD (6/12): Russia also wants to develop its own flavor of Linux.

Posted on June 6, 2019 at 7:04 AM
