Massive Police Shootout in Cleveland Despite Lack of Criminals

This is an amazing story. I urge you to read the whole thing, but here are the basics:

A November car chase ended in a “full blown-out” firefight, with glass and bullets flying, according to Cleveland police officers who described for investigators the chaotic scene at the end of the deadly 25-minute pursuit.

But when the smoky haze—caused by rapid fire of nearly 140 bullets in less than 30 seconds—dissipated, it soon became clear that more than a dozen officers had been firing at one another across a middle school parking lot in East Cleveland.

At the end of the scene, both unarmed—and presumably innocent—people in the car were dead.

There’s a lot that can be said here, but I don’t feel qualified to say it. There’s a whole body of research on decision making under stress—police, firefighters, soldiers—and how easy it is to get caught up in the heat of the moment. I have read one book on that subject, Sources of Power, but that was years ago.

What interests me right now is how this whole situation was colored by what “society” is talking about and afraid of, which became the preconceptions the officers brought to the event. School shootings are in the news, so as soon as the car drove into a school parking lot, the police assumed the worst. Firefights with dangerous criminals are what we see on TV, so that’s not unexpected, either. When you read the story, it’s clear how many of the elements that the officers believed—police cars being rammed, for example—are right out of television violence. This would have turned out very differently if the officers had assumed that, as is almost always true, the two people in the car were just two people in a car.

I’m also curious as to how much technology contributed to this. Reports on the radio brought more and more officers to the scene, and misinformation was broadcast over the radio.

Again, I’m not really qualified to write about any of this. But it’s what I’ve been thinking about.

Posted on February 12, 2013 at 12:55 PM

Our New Regimes of Trust

Society runs on trust. Over the millennia, we’ve developed a variety of mechanisms to induce trustworthy behavior in society. These range from a sense of guilt when we cheat, to societal disapproval when we lie, to laws that arrest fraudsters, to door locks and burglar alarms that keep thieves out of our homes. They’re complicated and interrelated, but they tend to keep society humming along.

The information age is transforming our society. We’re shifting from evolved social systems to deliberately created socio-technical systems. Instead of having conversations in offices, we use Facebook. Instead of meeting friends, we IM. We shop online. We let various companies and governments collect comprehensive dossiers on our movements, our friendships, and our interests. We let others censor what we see and read. I could go on for pages.

None of this is news to anyone. But what’s important, and much harder to predict, are the social changes resulting from these technological changes. With the rapid proliferation of computing devices—both fixed and mobile—and in-the-cloud processing, new ways of socialization have emerged. Facebook friends are fundamentally different than in-person friends. IM conversations are fundamentally different than voice conversations. Twitter has no pre-Internet analog. More social changes are coming. These social changes affect trust, and trust affects everything.

This isn’t just academic. There has always been a balance in society between the honest and the dishonest, and technology continually upsets that balance. Online banking results in new types of cyberfraud. Facebook posts become evidence in employment and legal disputes. Cell phone location tracking can be used to round up political dissidents. Random blogs and websites become trusted sources, abetting propaganda. Crime has changed: easier impersonation, action at a greater distance, automation, and so on. The more our nation’s infrastructure relies on cyberspace, the more vulnerable we are to cyberattack.

Think of this as a “security gap”: the time lag between when the bad guys figure out how to exploit a new technology and when the good guys figure out how to restore society’s balance.

Critically, the security gap is larger when there’s more technology, and especially in times of rapid technological change. More importantly, it’s larger in times of rapid social change due to the increased use of technology. This is our world today. We don’t know *how* the proliferation of networked, mobile devices will affect the systems we have in place to enable trust, but we do know it *will* affect them.

Trust is as old as our species. It’s something we do naturally, and informally. We don’t trust doctors because we’ve vetted their credentials, but because they sound learned. We don’t trust politicians because we’ve analyzed their positions, but because we generally agree with their political philosophy—or the buzzwords they use. We trust many things because our friends trust them. It’s the same with corporations, government organizations, strangers on the street: this thing that’s critical to society’s smooth functioning occurs largely through intuition and relationship. Unfortunately, these traditional and low-tech mechanisms are increasingly failing us. We need to understand how trust is being, and will be, affected—probably not by predicting, but by recognizing effects as quickly as possible—and then deliberately create mechanisms to induce trustworthiness and enable trust. That is the only way society will be able to adapt.

If there’s anything I’ve learned in all my years working at the intersection of security and technology, it’s that technology is rarely more than a small piece of the solution. People are always the issue and we need to think as broadly as possible about solutions. So while laws are important, they don’t work in isolation. Much of our security comes from the informal mechanisms we’ve evolved over the millennia: systems of morals and reputation.

There will exist new regimes of trust in the information age. They simply must evolve, or society will suffer unpredictably. We have already begun fleshing out such regimes, albeit in an ad hoc manner. It’s time for us to deliberately think about how trust works in the information age, and use legal, social, and technological tools to enable this trust. We might get it right by accident, but it’ll be a long and ugly iterative process getting there if we do.

This essay was originally published in The SciTech Lawyer, Winter/Spring 2013.

Posted on February 12, 2013 at 6:53 AM

Over $3M in Prizes to Hack Google Chrome

Google’s contest at the CanSecWest conference:

Today we’re announcing our third Pwnium competition: Pwnium 3. Google Chrome is already featured in the Pwn2Own competition this year, so Pwnium 3 will have a new focus: Chrome OS.

We’ll issue Pwnium 3 rewards for Chrome OS at the following levels, up to a total of $3.14159 million USD:

  • $110,000: browser or system level compromise in guest mode or as a logged-in user, delivered via a web page.
  • $150,000: compromise with device persistence—guest to guest with interim reboot, delivered via a web page.

We believe these larger rewards reflect the additional challenge involved with tackling the security defenses of Chrome OS, compared to traditional operating systems.

News article.

Posted on February 7, 2013 at 6:35 AM