Entries Tagged "NIST"


NIST Issues Call for "Lightweight Cryptography" Algorithms

This is interesting:

Creating these defenses is the goal of NIST’s lightweight cryptography initiative, which aims to develop cryptographic algorithm standards that can work within the confines of a simple electronic device. Many of the sensors, actuators and other micromachines that will function as eyes, ears and hands in IoT networks will work on scant electrical power and use circuitry far more limited than the chips found in even the simplest cell phone. Similar small electronics exist in the keyless entry fobs to newer-model cars and the Radio Frequency Identification (RFID) tags used to locate boxes in vast warehouses.

All of these gadgets are inexpensive to make and will fit nearly anywhere, but common encryption methods may demand more electronic resources than they possess.

The NSA’s SIMON and SPECK would certainly qualify.
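To give a sense of what "lightweight" means in practice, here is a sketch of Speck64/128 based on the published specification. It is illustrative, not production code, but the entire cipher is a few dozen lines of rotates, additions, and XORs, which is exactly the point of the lightweight designation.

```python
# Sketch of Speck64/128: 32-bit words, 128-bit key, 27 rounds,
# following the published specification's round function and key schedule.
MASK = 0xFFFFFFFF  # 32-bit word mask

def ror(v, r):
    return ((v >> r) | (v << (32 - r))) & MASK

def rol(v, r):
    return ((v << r) | (v >> (32 - r))) & MASK

def expand_key(key_words):
    """key_words = (K3, K2, K1, K0); returns the 27 round keys."""
    k3, k2, k1, k0 = key_words
    l, k = [k1, k2, k3], [k0]
    for i in range(26):
        l.append(((k[i] + ror(l[i], 8)) & MASK) ^ i)
        k.append(rol(k[i], 3) ^ l[i + 3])
    return k

def encrypt(x, y, key_words):
    for rk in expand_key(key_words):
        x = ((ror(x, 8) + y) & MASK) ^ rk  # rotate, add, mix in round key
        y = rol(y, 3) ^ x
    return x, y
```

Each round uses only a rotation, a modular addition, and two XORs; there are no S-boxes or large tables to fit into a constrained device.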

Posted on May 2, 2018 at 6:40 AM

Changes in Password Best Practices

NIST recently published its four-volume SP 800-63 Digital Identity Guidelines. Among other things, the authentication volume, SP 800-63B, makes three important suggestions when it comes to passwords:

  1. Stop it with the annoying password complexity rules. They make passwords harder to remember. They increase errors because artificially complex passwords are harder to type in. And they don’t help that much. It’s better to allow people to use pass phrases.
  2. Stop it with password expiration. That was an old idea for an old way we used computers. Today, don’t make people change their passwords unless there’s indication of compromise.
  3. Let people use password managers. This is how we deal with all the passwords we need.
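The gist of those three recommendations fits in a few lines. This is a minimal sketch of an SP 800-63B-style check; the blocklist here is a tiny hypothetical stand-in for the large corpus of breached passwords a real verifier would consult.

```python
# Minimal sketch of SP 800-63B-style password verification: a length
# floor, a generous length ceiling, and a check against known-breached
# passwords. No composition rules, no scheduled expiration.
# BREACHED is a made-up stand-in for a real compromised-password corpus.
BREACHED = {"password", "123456", "qwerty", "letmein"}

def acceptable(password: str) -> bool:
    if len(password) < 8:    # memorized secrets: at least 8 characters
        return False
    if len(password) > 64:   # verifiers must allow at least 64
        return False
    return password.lower() not in BREACHED
```

Note what is absent: no required digits or symbols, and nothing that forces a change every 90 days. A long passphrase like "correct horse battery staple" passes; "password" does not.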

These password rules were failed attempts to fix the user. Better we fix the security systems.

Posted on October 10, 2017 at 6:19 AM

The Cost of Cyberattacks Is Less than You Might Think

Interesting research from Sasha Romanosky at RAND:

Abstract: In 2013, the US President signed an executive order designed to help secure the nation’s critical infrastructure from cyberattacks. As part of that order, he directed the National Institute of Standards and Technology (NIST) to develop a framework that would become an authoritative source for information security best practices. Because adoption of the framework is voluntary, it faces the challenge of incentivizing firms to follow along. Will frameworks such as that proposed by NIST really induce firms to adopt better security controls? And if not, why? This research seeks to examine the composition and costs of cyber events, and attempts to address whether or not there exist incentives for firms to improve their security practices and reduce the risk of attack. Specifically, we examine a sample of over 12,000 cyber events that include data breaches, security incidents, privacy violations, and phishing crimes. First, we analyze the characteristics of these breaches (such as causes and types of information compromised). We then examine the breach and litigation rate, by industry, and identify the industries that incur the greatest costs from cyber events. We then compare these costs to bad debts and fraud within other industries. The findings suggest that public concerns regarding the increasing rates of breaches and legal actions may be excessive compared to the relatively modest financial impact to firms that suffer these events. Specifically, we find that the cost of a typical cyber incident in our sample is less than $200,000 (about the same as the firm’s annual IT security budget), and that this represents only 0.4% of their estimated annual revenues.

The result is that it often makes business sense to underspend on cybersecurity and just pay the costs of breaches:

Romanosky analyzed 12,000 incident reports and found that typically they only account for 0.4 per cent of a company’s annual revenues. That compares to billing fraud, which averages at 5 per cent, or retail shrinkage (ie, shoplifting and insider theft), which accounts for 1.3 per cent of revenues.

As for reputational damage, Romanosky found that it was almost impossible to quantify. He spoke to many executives and none of them could give a reliable metric for how to measure the PR cost of a public failure of IT security systems.

He also noted that the effects of a data incident typically don’t have many ramifications on the stock price of a company in the long term. Under the circumstances, it doesn’t make a lot of sense to invest too much in cyber security.
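The comparison in the quoted passage is easy to reproduce. The revenue figure below is the one implied by the quoted numbers ($200,000 is 0.4% of $50 million); all values are illustrative.

```python
# Reproducing the quoted comparison for a hypothetical firm whose revenue
# is implied by the article's figures ($200,000 ~= 0.4% of revenue).
annual_revenue = 50_000_000
cyber_cost    = 0.004 * annual_revenue   # typical cyber incident
billing_fraud = 0.05  * annual_revenue   # billing fraud, per the comparison
shrinkage     = 0.013 * annual_revenue   # retail shrinkage
print(f"cyber: ${cyber_cost:,.0f}, fraud: ${billing_fraud:,.0f}, "
      f"shrinkage: ${shrinkage:,.0f}")
```

Put side by side, a $200,000 incident cost is dwarfed by the $2.5 million a comparable firm loses to billing fraud, which is the arithmetic behind the underinvestment incentive.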

What’s being left out of these costs are the externalities. Yes, the costs to a company of a cyberattack are low to them, but there are often substantial additional costs borne by other people. The way to look at this is not to conclude that cybersecurity isn’t really a problem, but instead that there is a significant market failure that governments need to address.

Posted on September 29, 2016 at 6:51 AM

NIST Starts Planning for Post-Quantum Cryptography

Last year, the NSA announced its plans for transitioning to cryptography that is resistant to a quantum computer. Now, it’s NIST’s turn. Its just-released report talks about the importance of algorithm agility and quantum resistance. Sometime soon, it’s going to have a competition for quantum-resistant public-key algorithms:

Creating those newer, safer algorithms is the longer-term goal, Moody says. A key part of this effort will be an open collaboration with the public, which will be invited to devise and vet cryptographic methods that—to the best of experts’ knowledge—will be resistant to quantum attack. NIST plans to launch this collaboration formally sometime in the next few months, but in general, Moody says it will resemble past competitions such as the one for developing the SHA-3 hash algorithm, used in part for authenticating digital messages.

“It will be a long process involving public vetting of quantum-resistant algorithms,” Moody said. “And we’re not expecting to have just one winner. There are several systems in use that could be broken by a quantum computer—public-key encryption and digital signatures, to take two examples—and we will need different solutions for each of those systems.”

The report rightly states that we’re okay in the symmetric cryptography world; the key lengths are long enough.
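The reasoning behind that claim is well known: Grover's algorithm searches a k-bit keyspace in roughly 2^(k/2) steps, so a quantum attacker halves a symmetric key's effective strength. AES-256 therefore still leaves a comfortable 128-bit security margin. A sketch of the arithmetic:

```python
# Grover's algorithm finds a k-bit key in about 2^(k/2) evaluations,
# so a symmetric key retains roughly half its bits of security against
# a quantum attacker. Illustrative arithmetic only.
def security_bits(key_bits, quantum=False):
    return key_bits // 2 if quantum else key_bits

for k in (128, 192, 256):
    print(f"AES-{k}: classical {security_bits(k)} bits, "
          f"quantum ~{security_bits(k, quantum=True)} bits")
```

This is why the symmetric fix is simply to double key lengths, while public-key algorithms need to be replaced outright.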

This is an excellent development. NIST has done a fine job with its previous cryptographic standards, giving us a couple of good, strong, well-reviewed, and patent-free algorithms. I have no doubt this process will be equally good. (If NIST is keeping a list, aside from post-quantum public-key algorithms, I would like to see competitions for a larger-block-size block cipher and a super-fast stream cipher as well.)

Two news articles.

Posted on May 9, 2016 at 6:19 AM

New NIST Encryption Guidelines

NIST has published a draft of its new standard for encryption use: “NIST Special Publication 800-175B, Guideline for Using Cryptographic Standards in the Federal Government: Cryptographic Mechanisms.” In it, FIPS 185, the Escrowed Encryption Standard from the 1990s, is no longer certified, and neither is Skipjack, the NSA’s symmetric algorithm from the same period.

I see nothing sinister about decertifying Skipjack. In a world of faster computers and post-quantum thinking, an 80-bit key and 64-bit block no longer cut it.
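The numbers behind that judgment are straightforward: a 64-bit block runs into birthday-bound collisions after about 2^32 blocks of data under one key, and an 80-bit key is within reach of a determined brute-force effort. Roughly:

```python
# Rough capacity math for Skipjack's parameters (illustrative only).
block_bits, key_bits = 64, 80
birthday_blocks = 2 ** (block_bits // 2)  # ~2^32 blocks before collisions
data_gib = birthday_blocks * 8 // 2**30   # 8-byte blocks -> GiB under one key
print(f"collision trouble after ~{data_gib} GiB encrypted under one key")
print(f"exhaustive key search: 2^{key_bits} = {2**key_bits:.3e} trials")
```

Thirty-two gigabytes is an afternoon of traffic on a modern link, and 2^80 is far below the 2^128 floor expected of current ciphers, let alone a post-quantum margin.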

ETA: My essays from 1998 on Skipjack and KEA.

Posted on March 17, 2016 at 9:54 AM

Back Door in Juniper Firewalls

Juniper has warned about a malicious back door in its firewalls that automatically decrypts VPN traffic. It’s been there for years.

Hopefully details are forthcoming, but the folks at Hacker News have pointed to this page about Juniper’s use of the Dual_EC_DRBG random number generator. For those who don’t immediately recognize that name, it’s the pseudo-random-number generator that was backdoored by the NSA. Basically, the PRNG relies on two public parameters that can be generated from a secret value, and anyone who knows that secret can predict the output. In the standard, the NSA chose those parameters. Juniper doesn’t use those tainted parameters. Instead:

ScreenOS does make use of the Dual_EC_DRBG standard, but is designed to not use Dual_EC_DRBG as its primary random number generator. ScreenOS uses it in a way that should not be vulnerable to the possible issue that has been brought to light. Instead of using the NIST recommended curve points it uses self-generated basis points and then takes the output as an input to FIPS/ANSI X.9.31 PRNG, which is the random number generator used in ScreenOS cryptographic operations.

This means that all anyone has to do to break the PRNG is to hack into the firewall and copy or modify those “self-generated basis points.”
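The trapdoor structure is concrete enough to demonstrate. The sketch below uses a tiny made-up curve over GF(1019) and skips the output truncation of the real standard, which uses NIST P-256; but the algebra is the same: whoever knows the secret scalar d relating the two public points P and Q can turn one observed output into the generator's next internal state.

```python
# Toy model of the Dual_EC_DRBG trapdoor. All parameters are made up:
# a tiny curve y^2 = x^3 + 2x + 3 over GF(1019), no output truncation.
P_MOD, A, B = 1019, 2, 3

def inv(a):
    return pow(a, P_MOD - 2, P_MOD)

def add(p1, p2):                                     # curve point addition
    if p1 is None: return p2
    if p2 is None: return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None                                  # point at infinity
    if p1 == p2:
        m = (3 * x1 * x1 + A) * inv(2 * y1) % P_MOD  # tangent slope
    else:
        m = (y2 - y1) * inv(x2 - x1) % P_MOD         # chord slope
    x3 = (m * m - x1 - x2) % P_MOD
    return (x3, (m * (x1 - x3) - y1) % P_MOD)

def mul(k, pt):                                      # double-and-add
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt, k = add(pt, pt), k >> 1
    return acc

def lift_x(xc):
    """Recover a curve point from an x-coordinate (P_MOD % 4 == 3)."""
    rhs = (xc ** 3 + A * xc + B) % P_MOD
    y = pow(rhs, (P_MOD + 1) // 4, P_MOD)
    return (xc, y) if y * y % P_MOD == rhs else None

# The "designer" picks a point Q, a secret d, and publishes P = d*Q.
Q = next(pt for pt in map(lift_x, range(1, P_MOD)) if pt)
d = 123                                              # the trapdoor
P = mul(d, Q)

def drbg_step(s):
    """One Dual_EC-style step: advance state with P, emit output with Q."""
    s_next = mul(s, P)[0]
    return s_next, mul(s_next, Q)[0]

s1, out0 = drbg_step(77)          # seed unknown to the attacker
s2, out1 = drbg_step(s1)

# Attacker sees only out0: lift it back to the point s1*Q, multiply by d
# to get s1*(d*Q) = s1*P, whose x-coordinate is the next state s2.
recovered = mul(d, lift_x(out0))[0]
assert recovered == s2
assert mul(recovered, Q)[0] == out1  # all later output is now predictable
```

This is also why replacing the points, as Juniper did, doesn't help if an attacker can later substitute points of their own: the backdoor belongs to whoever generated the parameters.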

Here’s a good summary of what we know. The conclusion:

Again, assuming this hypothesis is correct then, if it wasn’t the NSA who did this, we have a case where a US government backdoor effort (Dual-EC) laid the groundwork for someone else to attack US interests. Certainly this attack would be a lot easier given the presence of a backdoor-friendly RNG already in place. And I’ve not even discussed the SSH backdoor which, as Wired notes, could have been the work of a different group entirely. That backdoor certainly isn’t NOBUS—Fox-IT claim to have found the backdoor password in six hours.

More details to come, I’m sure.

EDITED TO ADD (12/21): A technical overview of the SSH backdoor.

EDITED TO ADD (12/22): Matthew Green wrote a really good technical post about this.

They then piggybacked on top of it to build a backdoor of their own, something they were able to do because all of the hard work had already been done for them. The end result was a period in which someone—maybe a foreign government—was able to decrypt Juniper traffic in the U.S. and around the world. And all because Juniper had already paved the road.

Another good article.

Posted on December 21, 2015 at 6:52 AM

How NIST Develops Cryptographic Standards

This document gives a good overview of how NIST develops cryptographic standards and guidelines. It’s still in draft, and comments are appreciated.

Given that NIST has been tainted by the NSA’s actions to subvert cryptographic standards and protocols, more transparency in this process is appreciated. I think NIST is doing a fine job and that it’s not shilling for the NSA, but it needs to do more to convince the world of that.

Posted on March 4, 2014 at 6:41 AM
