Entries Tagged "security conferences"


Security and Human Behavior (SHB 2016)

Earlier this week, I was at the ninth Workshop on Security and Human Behavior, hosted at Harvard University.

SHB is a small invitational gathering of people studying various aspects of the human side of security. The fifty or so people in the room include psychologists, economists, computer security researchers, sociologists, political scientists, philosophers, neuroscientists, lawyers, anthropologists, business school professors, and a smattering of others. It’s not just an interdisciplinary event; most of the people here are individually interdisciplinary.

These are the most intellectually stimulating two days of my year; this year someone called it “Bruce’s brain in conference form.”

The goal is maximum interaction and discussion. We do that by putting everyone on panels. There are eight six-person panels over the course of the two days. Everyone gets to talk for ten minutes about their work, and then there’s half an hour of discussion in the room. Then there are lunches, dinners, and receptions — all designed so people meet each other and talk.

This page lists the participants and gives links to some of their work. As usual, Ross Anderson liveblogged the talks.

Here are my posts on the first, second, third, fourth, fifth, sixth, seventh, and eighth SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops.

Posted on June 3, 2016 at 1:36 PM

Evidence Shows Data Breaches Not Increasing

This is both interesting and counterintuitive:

Our results suggest that publicly reported data breaches in the U.S. have not increased significantly over the past ten years, either in frequency or in size. Because the distribution of breach sizes is heavy-tailed, large (rare) events occur more frequently than intuition would suggest. This helps to explain why many reports show massive year-to-year increases in both the aggregate number of records exposed and the number of breaches. All of these reports lump data into yearly bins, and this amount of aggregation can often influence the apparent trends (Figure 1).

The idea that breaches are not necessarily worsening may seem counter-intuitive. The Red Queen hypothesis in biology provides a possible explanation. It states that organisms not only compete within their own species to gain reproductive advantage, but they must also compete with other species, leading to an evolutionary arms race. In our case, as security practices have improved, attacks have become more sophisticated, possibly resulting in stasis for both attackers and defenders. This hypothesis is consistent with observed patterns in the dataset. Indeed, for breaches over 500,000 records there was no increase in size or frequency of malicious data breaches, suggesting that for large breaches such an arms race could be occurring. Many large breaches have occurred over the past decade, but the largest was disclosed as far back as 2009, and the second largest was even earlier, in 2007. Future work could analyze these breaches in depth to determine whether more recent breaches have required more sophisticated attacks.

The research was presented at WEIS this week. According to their analysis, data breach frequency follows a negative binomial distribution, and breach size follows a log-normal distribution.
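
To see why yearly binning of heavy-tailed data produces dramatic-looking swings, here is a minimal simulation sketch. The parameters are made up for illustration, not fitted to the paper’s dataset: breach counts are drawn from a negative binomial distribution and breach sizes from a log-normal one, and the yearly totals still jump around by orders of magnitude because a rare, huge breach dominates whatever year it happens to land in.

```python
import numpy as np

rng = np.random.default_rng(0)
years = 10

# Hypothetical parameters for illustration only -- not fitted to any real dataset.
breach_counts = rng.negative_binomial(n=5, p=0.05, size=years)  # breaches per year

for year, count in enumerate(breach_counts, start=1):
    # Log-normal sizes: most breaches are small, a rare few are enormous.
    sizes = rng.lognormal(mean=8.0, sigma=2.5, size=count)
    print(f"Year {year}: {count:3d} breaches, {sizes.sum():12,.0f} records exposed")
```

Even though nothing about the underlying process changes from year to year, adjacent yearly totals can easily differ by a factor of ten or more, which is exactly the kind of apparent trend the authors caution against reading too much into.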

Posted on July 1, 2015 at 10:03 AM

What is the DoD's Position on Backdoors in Security Systems?

In May, Admiral James A. Winnefeld, Jr., vice-chairman of the Joint Chiefs of Staff, gave an address at the Joint Service Academies Cyber Security Summit at West Point. After he spoke for twenty minutes on the importance of Internet security and a good national defense, I was able to ask him a question (32:42 mark) about security versus surveillance:

Bruce Schneier: I’d like to hear you talk about this need to get beyond signatures and the more robust cyber defense and ask the industry to provide these technologies to make the infrastructure more secure. My question is, the only definition of “us” that makes sense is the world, is everybody. Any technologies that we’ve developed and built will be used by everyone — nation-state and non-nation-state. So anything we do to increase our resilience, infrastructure, and security will naturally make Admiral Rogers’s both intelligence and attack jobs much harder. Are you okay with that?

Admiral James A. Winnefeld: Yes. I think Mike’s okay with that, also. That’s a really, really good question. We call that IGL. Anyone know what IGL stands for? Intel gain-loss. And there’s this constant tension between the operational community and the intelligence community when a military action could cause the loss of a critical intelligence node. We live this every day. In fact, in ancient times, when we were collecting actual signals in the air, we would be on the operational side, “I want to take down that emitter so it’ll make it safer for my airplanes to penetrate the airspace,” and they’re saying, “No, you’ve got to keep that emitter up, because I’m getting all kinds of intelligence from it.” So this is a familiar problem. But I think we all win if our networks are more secure. And I think I would rather live on the side of secure networks and a harder problem for Mike on the intelligence side than very vulnerable networks and an easy problem for Mike. And part of that — it’s not only the right thing to do, but part of that goes to the fact that we are more vulnerable than any other country in the world, on our dependence on cyber. I’m also very confident that Mike has some very clever people working for him. He might actually still be able to get some work done. But it’s an excellent question. It really is.

It’s a good answer, and one firmly on the side of not introducing security vulnerabilities, backdoors, key-escrow systems, or anything that weakens Internet systems. It speaks to what I have seen as a split in the Second Crypto War between the NSA and the FBI: building secure systems versus building systems with surveillance capabilities.

I have written about this before:

But here’s the problem: technological capabilities cannot distinguish based on morality, nationality, or legality; if the US government is able to use a backdoor in a communications system to spy on its enemies, the Chinese government can use the same backdoor to spy on its dissidents.

Even worse, modern computer technology is inherently democratizing. Today’s NSA secrets become tomorrow’s PhD theses and the next day’s hacker tools. As long as we’re all using the same computers, phones, social networking platforms, and computer networks, a vulnerability that allows us to spy also allows us to be spied upon.

We can’t choose a world where the US gets to spy but China doesn’t, or even a world where governments get to spy and criminals don’t. We need to choose, as a matter of policy, communications systems that are secure for all users, or ones that are vulnerable to all attackers. It’s security or surveillance.

NSA Director Admiral Mike Rogers was in the audience (he spoke earlier), and I saw him nodding at Winnefeld’s answer. Two weeks later, at CyCon in Tallinn, Rogers gave the opening keynote, and he seemed to be saying the opposite.

“Can we create some mechanism where within this legal framework there’s a means to access information that directly relates to the security of our respective nations, even as at the same time we are mindful we have got to protect the rights of our individual citizens?”

[…]

Rogers said a framework to allow law enforcement agencies to gain access to communications is in place within the phone system in the United States and other areas, so “why can’t we create a similar kind of framework within the internet and the digital age?”

He added: “I certainly have great respect for those that would argue that the most important thing is to ensure the privacy of our citizens and we shouldn’t allow any means for the government to access information. I would argue that’s not in the nation’s best long term interest, that we’ve got to create some structure that should enable us to do that mindful that it has to be done in a legal way and mindful that it shouldn’t be something arbitrary.”

Does Winnefeld know that Rogers is contradicting him? Can someone ask JCS about this?

Posted on June 24, 2015 at 7:42 AM

Security and Human Behavior (SHB 2015)

Earlier this week, I was at the eighth Workshop on Security and Human Behavior.

This is a small invitational gathering of people studying various aspects of the human side of security. The fifty people in the room include psychologists, computer security researchers, sociologists, behavioral economists, philosophers, political scientists, lawyers, biologists, anthropologists, business school professors, neuroscientists, and a smattering of others. It’s not just an interdisciplinary event; most of the people here are individually interdisciplinary.

I call this the most intellectually stimulating two days of my year. The goal is discussion amongst the group. We do that by putting everyone on panels, but only letting each person talk for 10 minutes. The rest of the 90-minute panel is left for discussion.

Ross Anderson liveblogged the talks. Bob Sullivan wrote a piece on some of the presentations on family surveillance.

Here are my posts on the first, second, third, fourth, fifth, sixth, and seventh SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops.

Posted on June 11, 2015 at 1:24 PM

Security and Human Behavior (SHB 2014)

I’m at SHB 2014: the Seventh Annual Interdisciplinary Workshop on Security and Human Behavior. This is a small invitational gathering of people studying various aspects of the human side of security. The fifty people in the room include psychologists, computer security researchers, sociologists, behavioral economists, philosophers, political scientists, lawyers, anthropologists, business school professors, neuroscientists, and a smattering of others. It’s not just an interdisciplinary event; most of the people here are individually interdisciplinary.

I call this the most intellectually stimulating two days of my year. The goal is discussion amongst the group. We do that by putting everyone on panels, but only letting each person talk for 5-7 minutes. The rest of the 90-minute panel is left for discussion.

The conference is organized by Alessandro Acquisti, Ross Anderson, and me. This year we’re at Cambridge University, in the UK.

The conference website contains a schedule and a list of participants, which includes links to writings by each of them. Ross Anderson is liveblogging the event. It’s also being recorded; I’ll post the link when it goes live.

Here are my posts on the first, second, third, fourth, fifth, and sixth SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops. It’s hard to believe we’ve been doing this for seven years.

Posted on June 9, 2014 at 4:50 AM

The Cryptopocalypse

There was a presentation at Black Hat last month warning us of a “factoring cryptopocalypse”: a moment when factoring numbers and solving the discrete log problem become easy, and both RSA and DH break. This presentation was provocative, and has generated a lot of commentary, but I don’t see any reason to worry.

Yes, breaking modern public-key cryptosystems has gotten easier over the years. This has been true for a few decades now. Back in 1999, I wrote this about factoring:

Factoring has been getting easier. It’s been getting easier faster than anyone has anticipated. I see four reasons why this is so:

  • Computers are getting faster.
  • Computers are better networked.
  • The factoring algorithms are getting more efficient.
  • Fundamental advances in mathematics are giving us better factoring algorithms.

I could have said the same thing about the discrete log problem. And, in fact, advances in solving one problem tend to mirror advances in solving the other.

The reasons are arrayed in order of unpredictability. The first two — advances in computing and networking speed — basically follow Moore’s Law (and others), year after year. The third comes in regularly, but in fits and starts: a 2x improvement here, a 10x improvement there. It’s the fourth that’s the big worry. Fundamental mathematical advances only come once in a while, but when they do come, the effects can be huge. If factoring ever becomes “easy” such that RSA is no longer a viable cryptographic algorithm, it will be because of this sort of advance.

The authors base their current warning on some recent fundamental advances in solving the discrete log problem, but that work doesn’t generalize to the types of numbers used for cryptography. And it’s not going to generalize; the results are too specialized.

This isn’t to say that solving these problems won’t continue to get easier, but so far it has been trivially easy to increase key lengths to stay ahead of the advances. I expect this to remain true for the foreseeable future.
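
As a rough illustration of why longer keys keep us ahead, here is a small sketch of my own (not from the original post) using the standard heuristic running-time estimate for the general number field sieve, the best known classical factoring algorithm, to compare the work needed to factor moduli of different sizes. The formula ignores lower-order terms, so the absolute numbers are crude; the relative growth is the point.

```python
import math

def gnfs_work(bits: int) -> float:
    """Heuristic GNFS cost L_n[1/3, (64/9)^(1/3)] for an n-bit modulus (lower-order terms ignored)."""
    ln_n = bits * math.log(2)
    c = (64 / 9) ** (1 / 3)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

base = gnfs_work(1024)
for bits in (1024, 2048, 3072, 4096):
    work = gnfs_work(bits)
    print(f"{bits}-bit modulus: roughly 2^{math.log2(work):.0f} operations "
          f"({work / base:.1e} times the 1024-bit cost)")
```

Doubling the modulus from 1024 to 2048 bits multiplies the estimated factoring effort by roughly nine orders of magnitude, which is why steady hardware gains and incremental algorithmic improvements can be absorbed simply by moving to longer keys; only a fundamental mathematical breakthrough would change that picture.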

Posted on August 19, 2013 at 6:47 AM

Security and Human Behavior (SHB 2013)

I’m at the Sixth Interdisciplinary Workshop on Security and Human Behavior (SHB 2013). This year we’re in Los Angeles, at USC — hosted by CREATE.

My description from last year still applies:

SHB is an invitational gathering of psychologists, computer security researchers, behavioral economists, sociologists, law professors, business school professors, political scientists, anthropologists, philosophers, and others — all of whom are studying the human side of security — organized by Alessandro Acquisti, Ross Anderson, and me. It’s not just an interdisciplinary event; most of the people here are individually interdisciplinary.

It is still the most intellectually stimulating conference I attend all year. The format has remained unchanged since the beginning. Each panel consists of six people. Everyone has ten minutes to talk, and then we have half an hour of questions and discussion. The format maximizes interaction, which is really important in an interdisciplinary conference like this one.

The conference website contains a schedule and a list of participants, which includes links to writings by each of them. Both Ross Anderson and Vaibhav Garg have liveblogged the event.

Here are my posts on the first, second, third, fourth, and fifth SHB workshops. Follow those links to find summaries, papers, and audio recordings of the workshops.

Posted on June 5, 2013 at 7:20 AM

Marketing at the RSA Conference

Marcus Ranum has an interesting screed on “booth babes” in the RSA Conference exhibition hall:

I’m not making a moral argument about sexism in our industry or the objectification of women. I could (and probably should) but it’s easier to just point out the obvious: the only customers that will be impressed by anyone’s ability to hire pretty models to work their booth aren’t going to be the ones signing the big purchase orders. And, it’s possible that they’re thinking your sales team are going to be a bunch of testosterone-laden assholes who’d be better off selling used tires. If some company wants to appeal to the consumer that’s going to jump at the T&A maybe they should relocate up the street to O’Farrell where they can include a happy ending with their product demo.

Mike Rothman on the same topic.

EDITED TO ADD (3/11): Winn Schwartau makes a similar point.

Posted on March 5, 2013 at 1:58 PM

Me at the RSA Conference

I’ll be speaking twice at the RSA Conference this year. I’m giving a solo talk Tuesday at 1:00, and participating in a debate about training Wednesday at noon. This is a short written preview of my solo talk, and this is an audio interview on the topic.

Additionally: Akamai is giving away 1,500 copies of Liars and Outliers, and Zscaler is giving away 300 copies of Schneier on Security. I’ll be doing book signings in both of those companies’ booths and at the conference bookstore.

Posted on February 25, 2013 at 1:49 PM
