Entries Tagged "academic papers"


Evaluating Risks of Low-Probability High-Cost Events

“Probing the Improbable: Methodological Challenges for Risks with Low Probabilities and High Stakes,” by Toby Ord, Rafaela Hillerbrand, and Anders Sandberg.

Abstract:

Some risks have extremely high stakes. For example, a worldwide pandemic or asteroid impact could potentially kill more than a billion people. Comfortingly, scientific calculations often put very low probabilities on the occurrence of such catastrophes. In this paper, we argue that there are important new methodological problems which arise when assessing global catastrophic risks and we focus on a problem regarding probability estimation. When an expert provides a calculation of the probability of an outcome, they are really providing the probability of the outcome occurring, given that their argument is watertight. However, their argument may fail for a number of reasons such as a flaw in the underlying theory, a flaw in the modeling of the problem, or a mistake in the calculations. If the probability estimate given by an argument is dwarfed by the chance that the argument itself is flawed, then the estimate is suspect. We develop this idea formally, explaining how it differs from the related distinctions of model and parameter uncertainty. Using the risk estimates from the Large Hadron Collider as a test case, we show how serious the problem can be when it comes to catastrophic risks and how best to address it.
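
To make the paper's point concrete, here is a small numerical sketch (the figures below are invented for illustration, not taken from the paper): even if a calculation puts the probability of catastrophe at one in a billion given that the argument behind it is sound, a modest chance that the argument itself is flawed can dominate the overall estimate.

```python
# Illustrative only: the probabilities below are invented, not taken from the paper.
# P(X) = P(X | argument sound) * P(sound) + P(X | argument flawed) * P(flawed)

p_x_given_sound = 1e-9   # catastrophe probability if the expert's argument is watertight
p_flawed = 1e-3          # chance the theory, model, or calculation is wrong
p_x_given_flawed = 1e-4  # what we can say about the risk if the argument fails (very uncertain)

p_x = p_x_given_sound * (1 - p_flawed) + p_x_given_flawed * p_flawed
print(f"Overall probability of catastrophe: {p_x:.2e}")
# ~1e-7: the result is dominated by the chance that the argument is flawed,
# not by the 1e-9 figure the calculation itself produced.
```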

Posted on February 2, 2009 at 1:26 PM

Who Worries About Terrorism?

The paper, “Terrorism-Related Fear and Avoidance Behavior in a Multiethnic Urban Population,” is for subscribers only.

Abstract

Objectives. We sought to determine whether groups traditionally most vulnerable to disasters would be more likely than others to perceive population-level risk as high (as measured by the estimated color-coded alert level), would worry more about terrorism, and would avoid activities because of terrorism concerns.

Methods. We conducted a random digit dial survey of the Los Angeles County population October 2004 through January 2005 in 6 languages. We asked respondents what color alert level the country was under, how often they worry about terrorist attacks, and how often they avoid activities because of terrorism. Multivariate regression modeled correlates of worry and avoidance, including mental illness, disability, demographic factors, and estimated color-coded alert level.

Results. Persons who are mentally ill, those who are disabled, African Americans, Latinos, Chinese Americans, Korean Americans, and non-US citizens were more likely to perceive population-level risk as high, as measured by the estimated color-coded alert level. These groups also reported more worry and avoidance behaviors because of concerns about terrorism.

Conclusions. Vulnerable populations experience a disproportionate burden of the psychosocial impact of terrorism threats and our national response. Further studies should investigate the specific behaviors affected and further elucidate disparities in the disaster burden associated with terrorism and terrorism policies.
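
As a rough sketch of the kind of multivariate model described in the Methods above (column names and data are hypothetical; the study's actual variable coding may differ), a logistic regression on survey responses might look like this:

```python
# Hypothetical sketch of a multivariate model of terrorism-related worry.
# Column names and the data file are invented; the study's actual variables may differ.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_responses.csv")  # hypothetical survey extract

# Binary outcome: respondent reports frequent worry about terrorist attacks.
model = smf.logit(
    "worries_often ~ mental_illness + disability + C(ethnicity)"
    " + non_citizen + perceived_alert_level",
    data=df,
).fit()
print(model.summary())
```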

This is certainly related. As people search for health-related information on the Internet, a common result of their newfound “knowledge” is more stress and anxiety, which can manifest as new symptoms.

Posted on December 9, 2008 at 12:58 PM

P = NP?

People have been sending me this paper that “proves” that P != NP. These sorts of papers make the rounds regularly, and my advice is to not pay attention to any of them. G.J. Woeginger keeps a list of these papers—he has 43 so far—and points out:

The following paragraphs list many papers that try to contribute to the P-versus-NP question. Among all these papers, there is only a single paper that has appeared in a peer-reviewed journal, that has thoroughly been verified by the experts in the area, and whose correctness is accepted by the general research community: The paper by Mihalis Yannakakis. (And this paper does not settle the P-versus-NP question, but “just” shows that a certain approach to settling this question will never work out.)

Of course, there’s a million-dollar prize for resolving the question—so expect the flawed proofs to continue.

Posted on November 4, 2008 at 12:12 PM

The Skein Hash Function

NIST is holding a competition to replace the SHA family of hash functions, which have been increasingly under attack. (I wrote about an early NIST hash workshop here.)

Skein is our submission (myself and seven others: Niels Ferguson, Stefan Lucks, Doug Whiting, Mihir Bellare, Tadayoshi Kohno, Jon Callas, and Jesse Walker). Here’s the paper:

Executive Summary

Skein is a new family of cryptographic hash functions. Its design combines speed, security, simplicity, and a great deal of flexibility in a modular package that is easy to analyze.

Skein is fast. Skein-512—our primary proposal—hashes data at 6.1 clock cycles per byte on a 64-bit CPU. This means that on a 3.1 GHz x64 Core 2 Duo CPU, Skein hashes data at 500 MBytes/second per core—almost twice as fast as SHA-512 and three times faster than SHA-256. An optional hash-tree mode speeds up parallelizable implementations even more. Skein is fast for short messages, too; Skein-512 hashes short messages in about 1000 clock cycles.
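
The quoted throughput follows directly from the cycles-per-byte figure; here is a back-of-the-envelope check (not a benchmark):

```python
# Back-of-the-envelope check of the quoted throughput (not a benchmark).
clock_hz = 3.1e9       # 3.1 GHz Core 2 Duo, per core
cycles_per_byte = 6.1  # Skein-512 on a 64-bit CPU

bytes_per_second = clock_hz / cycles_per_byte
print(f"{bytes_per_second / 1e6:.0f} MB/s per core")  # ~508 MB/s, i.e. roughly 500 MBytes/second
```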

Skein is secure. Its conservative design is based on the Threefish block cipher. Our current best attack on Threefish-512 is on 25 of 72 rounds, for a safety factor of 2.9. For comparison, at a similar stage in the standardization process, the AES encryption algorithm had an attack on 6 of 10 rounds, for a safety factor of only 1.7. Additionally, Skein has a number of provably secure properties, greatly increasing confidence in the algorithm.
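
The safety factor quoted above is simply the ratio of total rounds to the number of rounds reached by the best known attack:

```python
# Safety factor = total rounds / rounds covered by the best known attack.
threefish_512 = 72 / 25     # best attack reaches 25 of 72 rounds
aes_at_submission = 10 / 6  # AES had a 6-of-10-round attack at a comparable stage

print(f"Threefish-512 safety factor: {threefish_512:.1f}")          # 2.9
print(f"AES safety factor at submission: {aes_at_submission:.1f}")  # 1.7
```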

Skein is simple. Using only three primitive operations, the Skein compression function can be easily understood and remembered. The rest of the algorithm is a straightforward iteration of this function.
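
The three primitive operations are 64-bit addition, rotation, and XOR, combined in Threefish's MIX step. Here is a minimal sketch of that step; the rotation constant used below is arbitrary, whereas Threefish uses a fixed schedule of constants that depends on the round and word position.

```python
# Minimal sketch of the Threefish MIX step: 64-bit add, rotate, XOR.
# The rotation constant passed in here is arbitrary; the real cipher uses a
# fixed schedule of rotation constants per round and word position.
MASK64 = (1 << 64) - 1

def rotl64(x: int, r: int) -> int:
    """Rotate a 64-bit word left by r bits."""
    return ((x << r) | (x >> (64 - r))) & MASK64

def mix(x0: int, x1: int, r: int) -> tuple[int, int]:
    y0 = (x0 + x1) & MASK64  # addition modulo 2**64
    y1 = rotl64(x1, r) ^ y0  # rotate, then XOR with the sum
    return y0, y1

print(mix(0x0123456789ABCDEF, 0xFEDCBA9876543210, 17))
```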

Skein is flexible. Skein is defined for three different internal state sizes—256 bits, 512 bits, and 1024 bits—and any output size. This allows Skein to be a drop-in replacement for the entire SHA family of hash functions. A completely optional and extendable argument system makes Skein an efficient tool to use for a very large number of functions: a PRNG, a stream cipher, a key derivation function, authentication without the overhead of HMAC, and a personalization capability. All these features can be implemented with very low overhead. Together with the Threefish large-block cipher at Skein's core, this design provides a full set of symmetric cryptographic primitives suitable for most modern applications.

Skein is efficient on a variety of platforms, both hardware and software. Skein-512 can be implemented in about 200 bytes of state. Small devices, such as 8-bit smart cards, can implement Skein-256 using about 100 bytes of memory. Larger devices can implement the larger versions of Skein to achieve faster speeds.

Skein was designed by a team of highly experienced cryptographic experts from academia and industry, with expertise in cryptography, security analysis, software, chip design, and implementation of real-world cryptographic systems. This breadth of knowledge allowed them to create a balanced design that works well in all environments.

Here’s source code, test vectors, and the like for Skein. Watch the Skein website for any updates—new code, new results, new implementations, the proofs.

NIST’s deadline is Friday. It seems as if everyone—including many amateurs—is working on a hash function, and I predict that NIST will receive at least 80 submissions. (Compare this to the fifteen NIST submissions received for the AES competition in 1998.) I expect people to start posting their submissions over the weekend. (Ron Rivest already presented MD6 at Crypto in August.) Probably the best place to watch for new hash functions is here; I’ll try to keep a listing of the submissions myself.

The selection process will take around four years. I’ve previously called this sort of thing a cryptographic demolition derby—last one left standing wins—but that’s only half true. Certainly all the groups will spend the next couple of years trying to cryptanalyze each other, but in the end there will be a bunch of unbroken algorithms; NIST will select one based on performance and features.

NIST has stated that the goal of this process is not to choose the best standard but to choose a good standard. I think that’s smart of them; in this process, “best” is the enemy of “good.” My advice is this: immediately sort them based on performance and features. Ask the cryptographic community to focus its attention on the top dozen, rather than spread its attention across all 80—although I also expect that most of the amateur submissions will be rejected by NIST for not being “complete and proper.” Otherwise, people will break the easy ones and the better ones will go unanalyzed.

EDITED TO ADD (10/30): Here is a single website for all information, including cryptanalysis, of all the SHA-3 submissions. I spoke to a reporter who told me that, as of yesterday, NIST had received 30 submissions. And here are three news articles about Skein.

Posted on October 29, 2008 at 6:35 AM

Designing a Malicious Processor

From the LEET ’08 conference: “Designing and implementing malicious hardware,” by Samuel T. King, Joseph Tucek, Anthony Cozzie, Chris Grier, Weihang Jiang, and Yuanyuan Zhou.

Abstract:

Hidden malicious circuits provide an attacker with a stealthy attack vector. As they occupy a layer below the entire software stack, malicious circuits can bypass traditional defensive techniques. Yet current work on trojan circuits considers only simple attacks against the hardware itself, and straightforward defenses. More complex designs that attack the software are unexplored, as are the countermeasures an attacker may take to bypass proposed defenses.

We present the design and implementation of Illinois Malicious Processors (IMPs). There is a substantial design space in malicious circuitry; we show that an attacker, rather than designing one specific attack, can instead design hardware to support attacks. Such flexible hardware allows powerful, general-purpose attacks, while remaining surprisingly low in the amount of additional hardware. We show two such hardware designs, and implement them in a real system. Further, we show three powerful attacks using this hardware, including a login backdoor that gives an attacker complete and high-level access to the machine. This login attack requires only 1341 additional gates: gates that can be used for other attacks as well. Malicious processors are more practical, more flexible, and harder to detect than an initial analysis would suggest.

Posted on October 16, 2008 at 12:39 PM

Threat Modeling at Microsoft

Interesting paper by Adam Shostack:

Abstract. Describes a decade of experience threat modeling products and services at Microsoft. Describes the current threat modeling methodology used in the Security Development Lifecycle. The methodology is a practical approach, usable by non-experts, centered on data flow diagrams and a threat enumeration technique of ‘STRIDE per element.’ The paper covers some lessons learned which are likely applicable to other security analysis techniques. The paper closes with some possible questions for academic research.
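
As a rough illustration of “STRIDE per element” (the mapping below follows the commonly published Microsoft chart, not necessarily the exact table in the paper), each element type in a data flow diagram is checked against the subset of STRIDE threats that usually applies to it:

```python
# Rough sketch of STRIDE-per-element: map each data-flow-diagram element type
# to the STRIDE threats typically considered for it. Based on the commonly
# published Microsoft chart; the paper is the authoritative source.
STRIDE = {
    "S": "Spoofing",
    "T": "Tampering",
    "R": "Repudiation",
    "I": "Information disclosure",
    "D": "Denial of service",
    "E": "Elevation of privilege",
}

THREATS_PER_ELEMENT = {
    "external entity": ["S", "R"],
    "process":         ["S", "T", "R", "I", "D", "E"],
    "data flow":       ["T", "I", "D"],
    "data store":      ["T", "R", "I", "D"],  # R mainly when the store is an audit log
}

def enumerate_threats(elements):
    """Yield (element name, threat) pairs to consider for each DFD element."""
    for name, kind in elements:
        for code in THREATS_PER_ELEMENT[kind]:
            yield name, STRIDE[code]

dfd = [("browser", "external entity"), ("web app", "process"),
       ("HTTP request", "data flow"), ("orders DB", "data store")]
for element, threat in enumerate_threats(dfd):
    print(f"{element}: {threat}")
```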

Posted on October 13, 2008 at 6:21 AM

