Entries Tagged "cameras"


The Mahmoud al-Mabhouh Assassination

Remember the Mahmoud al-Mabhouh assassination last January? The police identified 30 suspects, but haven’t been able to find any of them.

Police spent about 10,000 hours poring over footage from some 1,500 security cameras around Dubai. Using face-recognition software, electronic-payment records, receipts and interviews with taxi drivers and hotel staff, they put together a list of suspects and publicized it.

Seems ubiquitous electronic surveillance is no match for a sufficiently advanced adversary.

Posted on October 12, 2010 at 6:12 AM

DHS Still Worried About Terrorists Using Internet Surveillance

Profound analysis from the Department of Homeland Security:

Detailed video obtained through live Web-based camera feeds combined with street-level and direct overhead imagery views from Internet imagery sites allow terrorists to conduct remote surveillance of multiple potential targets without exposing themselves to detection.

Cameras, too.

Remember, anyone who searches for anything on the Internet may be a terrorist. Report him immediately.

Posted on September 16, 2010 at 6:34 AM

Burglary Detection through Video Analytics

This is interesting:

Some of the scenarios where we have installed video analytics for our clients include:

  • to detect someone walking in an area of their yard (veering off the main path) where they are not supposed to be;
  • to send an alarm if someone is standing too close to the front of a store window/front door after hours;
  • to alert security guards about someone in a parkade during specific hours;
  • to count the number of people coming into (and out of) a store during the day;

In the case of burglary prevention, getting an early warning about someone trespassing makes a huge difference for our response teams. Now, rather than waiting for a detector in the house to trip, we can receive an alarm signal while a potential burglar is still outside.

Effectiveness is going to be a question of limiting false positives.
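As a rough illustration of what limiting false positives means in practice, here is a minimal sketch of one common rule: only raise an alarm when a tracked person lingers inside a restricted zone for some minimum time, so that brief pass-throughs are ignored. The class names, coordinates, and threshold are hypothetical, not taken from the vendor quoted above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Zone:
    """Axis-aligned restricted area, e.g. the strip in front of a store window."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


class DwellAlarm:
    """Alert only after a tracked person has stayed inside the zone for
    dwell_seconds; brief pass-throughs (a common source of false positives)
    are ignored."""

    def __init__(self, zone: Zone, dwell_seconds: float):
        self.zone = zone
        self.dwell_seconds = dwell_seconds
        self._entered_at: Optional[float] = None  # when the current intrusion began

    def update(self, t: float, x: float, y: float) -> bool:
        """Feed one tracked position (time in seconds, image coordinates).
        Returns True once the dwell threshold has been exceeded."""
        if self.zone.contains(x, y):
            if self._entered_at is None:
                self._entered_at = t
            return (t - self._entered_at) >= self.dwell_seconds
        self._entered_at = None
        return False


# Example: someone lingering in front of the door after hours.
alarm = DwellAlarm(Zone(100, 200, 300, 480), dwell_seconds=30)
for t in range(60):                      # one tracked position per second
    if alarm.update(t, x=150, y=300):
        print(f"alert at t={t}s")
        break
```

Raising the dwell threshold trades missed brief intrusions for fewer nuisance alarms; that tuning, not the detection itself, is usually the hard part.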

Posted on July 14, 2010 at 12:54 PM

Filming the Police

In at least three U.S. states, it is illegal to film an on-duty police officer:

The legal justification for arresting the “shooter” rests on existing wiretapping or eavesdropping laws, with statutes against obstructing law enforcement sometimes cited. Illinois, Massachusetts, and Maryland are among the 12 states in which all parties must consent for a recording to be legal unless, as with TV news crews, it is obvious to all that recording is underway. Since the police do not consent, the camera-wielder can be arrested. Most all-party-consent states also include an exception for recording in public places where “no expectation of privacy exists” (Illinois does not) but in practice this exception is not being recognized.

Massachusetts attorney June Jensen represented Simon Glik who was arrested for such a recording. She explained, “[T]he statute has been misconstrued by Boston police. You could go to the Boston Common and snap pictures and record if you want.” Legal scholar and professor Jonathan Turley agrees, “The police are basing this claim on a ridiculous reading of the two-party consent surveillance law—requiring all parties to consent to being taped. I have written in the area of surveillance law and can say that this is utter nonsense.”

The courts, however, disagree. A few weeks ago, an Illinois judge rejected a motion to dismiss an eavesdropping charge against Christopher Drew, who recorded his own arrest for selling one-dollar artwork on the streets of Chicago. Although the misdemeanor charges of not having a peddler’s license and peddling in a prohibited area were dropped, Drew is being prosecuted for illegal recording, a Class 1 felony punishable by 4 to 15 years in prison.

This is a horrible idea, and will make us all less secure. I wrote in 2008:

You cannot evaluate the value of privacy and disclosure unless you account for the relative power levels of the discloser and the disclosee.

If I disclose information to you, your power with respect to me increases. One way to address this power imbalance is for you to similarly disclose information to me. We both have less privacy, but the balance of power is maintained. But this mechanism fails utterly if you and I have different power levels to begin with.

An example will make this clearer. You’re stopped by a police officer, who demands to see identification. Divulging your identity will give the officer enormous power over you: He or she can search police databases using the information on your ID; he or she can create a police record attached to your name; he or she can put you on this or that secret terrorist watch list. Asking to see the officer’s ID in return gives you no comparable power over him or her. The power imbalance is too great, and mutual disclosure does not make it OK.

You can think of your existing power as the exponent in an equation that determines the value, to you, of more information. The more power you have, the more additional power you derive from the new data.

Another example: When your doctor says “take off your clothes,” it makes no sense for you to say, “You first, doc.” The two of you are not engaging in an interaction of equals.

This is the principle that should guide decision-makers when they consider installing surveillance cameras or launching data-mining programs. It’s not enough to open the efforts to public scrutiny. All aspects of government work best when the relative power between the governors and the governed remains as small as possible—when liberty is high and control is low. Forced openness in government reduces the relative power differential between the two, and is generally good. Forced openness in laypeople increases the relative power, and is generally bad.
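To make the "exponent" analogy from the excerpt above concrete, here is one purely illustrative formalization; the functional form is mine, not anything claimed in the essay.

```latex
% Illustrative only: d = amount of new information obtained,
% p = the existing power of the party obtaining it.
\[
  V(d) = d^{\,p}
  \qquad\Longrightarrow\qquad
  \frac{dV}{dd} = p\,d^{\,p-1}
\]
% For d > 1, the same disclosure is worth more, and its value grows faster,
% the larger p already is -- which is why mutual disclosure favors the more
% powerful party.
```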

EDITED TO ADD (7/13): Another article. One jurisdiction in Pennsylvania has explicitly ruled the opposite: that it’s legal to record police officers no matter what.

Posted on June 16, 2010 at 1:36 PM

Alerting Users that Applications are Using Cameras, Microphones, Etc.

Interesting research: “What You See is What They Get: Protecting users from unwanted use of microphones, cameras, and other sensors,” by Jon Howell and Stuart Schechter.

Abstract: Sensors such as cameras and microphones collect privacy-sensitive data streams without the user’s explicit action. Conventional sensor access policies either hassle users to grant applications access to sensors or grant with no approval at all. Once access is granted, an application may collect sensor data even after the application’s interface suggests that the sensor is no longer being accessed.

We introduce the sensor-access widget, a graphical user interface element that resides within an application’s display. The widget provides an animated representation of the personal data being collected by its corresponding sensor, calling attention to the application’s attempt to collect the data. The widget indicates whether the sensor data is currently allowed to flow to the application. The widget also acts as a control point through which the user can configure the sensor and grant or deny the application access. By building perpetual disclosure of sensor data collection into the platform, sensor-access widgets enable new access-control policies that relax the tension between the user’s privacy needs and applications’ ease of access.
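The paper describes its own prototypes; what follows is only my minimal sketch of the "control point" idea from the abstract, with hypothetical names and a print statement standing in for the animated on-screen indicator.

```python
from typing import Callable, Optional

class SensorAccessWidget:
    """Minimal sketch of the control-point idea: every sensor sample the
    application wants must pass through the widget, which discloses the
    collection attempt to the user and forwards data only while access is
    granted. Names and the print-based indicator are hypothetical."""

    def __init__(self, sensor_name: str, read_sensor: Callable[[], bytes]):
        self.sensor_name = sensor_name
        self._read_sensor = read_sensor   # platform-provided capture function
        self._allowed = False             # deny by default

    # User-facing controls on the widget itself.
    def grant(self) -> None:
        self._allowed = True

    def deny(self) -> None:
        self._allowed = False

    # The application's only path to the sensor data.
    def next_sample(self) -> Optional[bytes]:
        """Return a sample only if the user currently allows the flow;
        the collection attempt is disclosed on screen either way."""
        sample = self._read_sensor()
        state = "FLOWING to app" if self._allowed else "BLOCKED"
        print(f"[{self.sensor_name}] collected {len(sample)} bytes -- {state}")
        return sample if self._allowed else None


# Example with a fake microphone: the app sees data only after grant().
widget = SensorAccessWidget("microphone", read_sensor=lambda: b"\x00" * 320)
print(widget.next_sample())              # None: blocked, but still disclosed
widget.grant()
print(len(widget.next_sample() or b""))  # 320: user allowed the flow
```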

Apple seems to be taking some steps in this direction with the location sensor disclosure in iPhone OS 4.0.

Posted on May 24, 2010 at 7:32 AM

Preventing Terrorist Attacks in Crowded Areas

On the New York Times Room for Debate Blog, I—along with several other people—was asked about how to prevent terrorist attacks in crowded areas. This is my response.

In the wake of Saturday’s failed Times Square car bombing, it’s natural to ask how we can prevent this sort of thing from happening again. The answer is to stop focusing on the specifics of what actually happened, and instead think about the threat in general.

Think about the security measures commonly proposed. Cameras won’t help. They don’t prevent terrorist attacks, and their forensic value after the fact is minimal. In the Times Square case, surely there’s enough other evidence—the car’s identification number, the auto body shop the stolen license plates came from, the name of the fertilizer store—to identify the guy. We will almost certainly not need the camera footage. The images released so far, like the images in so many other terrorist attacks, may make for exciting television, but their value to law enforcement officers is limited.

Checkpoints won’t help, either. You can’t check everybody and everything. There are too many people to check, and too many train stations, buses, theaters, department stores and other places where people congregate. Patrolling guards, bomb-sniffing dogs, chemical and biological weapons detectors: they all suffer from similar problems. In general, focusing on specific tactics or defending specific targets doesn’t make sense. They’re inflexible; possibly effective if you guess the plot correctly, but completely ineffective if you don’t. At best, the countermeasures just force the terrorists to make minor changes in their tactic and target.

It’s much smarter to spend our limited counterterrorism resources on measures that don’t focus on the specific. It’s more efficient to spend money on investigating and stopping terrorist attacks before they happen, and responding effectively to any that occur. This approach works because it’s flexible and adaptive; it’s effective regardless of what the bad guys are planning for next time.

After the Christmas Day airplane bombing attempt, I was asked how we can better protect our airplanes from terrorist attacks. I pointed out that the event was a security success—the plane landed safely, nobody was hurt, a terrorist was in custody—and that the next attack would probably have nothing to do with explosive underwear. After the Moscow subway bombing, I wrote that overly specific security countermeasures like subway cameras and sensors were a waste of money.

Now we have a failed car bombing in Times Square. We can’t protect against the next imagined movie-plot threat. Isn’t it time to recognize that the bad guys are flexible and adaptive, and that we need the same quality in our countermeasures?

I know, nothing I haven’t said many times before.

Steven Simon likes cameras, although his arguments are more movie-plot than real. Michael Black, Noah Shachtman, Michael Tarr, and Jeffrey Rosen all write about the limitations of security cameras. Paul Ekman wants more people. And Richard Clarke has a nice essay about how we shouldn’t panic.

Posted on May 4, 2010 at 1:31 PM

Life Recorder

In 2006, writing about future threats to privacy, I described a life recorder:

A “life recorder” you can wear on your lapel that constantly records is still a few generations off: 200 gigabytes/year for audio and 700 gigabytes/year for video. It’ll be sold as a security device, so that no one can attack you without being recorded.
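For scale, here is the back-of-the-envelope arithmetic those figures imply, assuming decimal gigabytes and continuous, year-round recording (my arithmetic, not from the original essay):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600       # about 31.5 million seconds

def implied_kbps(gigabytes_per_year: float) -> float:
    """Sustained bit rate implied by a yearly storage budget
    (decimal GB, continuous 24/7 recording assumed)."""
    bits = gigabytes_per_year * 1e9 * 8
    return bits / SECONDS_PER_YEAR / 1e3

print(f"audio: {implied_kbps(200):.0f} kbit/s")   # roughly 51 kbit/s
print(f"video: {implied_kbps(700):.0f} kbit/s")   # roughly 178 kbit/s
```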

I can’t find a quote right now, but in talks I would say that this kind of technology would first be used by groups of people with diminished rights: children, soldiers, prisoners, and the non-lucid elderly.

It’s been proposed:

With GPS capabilities built into phones that can be made ever smaller, and the ability for these phones to transmit both sound and video, isn’t it time to think about a wearable device that could be used to call for help and accurately report what was happening?

[…]

The device could contain cameras and microphones that activate if the device was triggered to create evidence that could locate an attacker and cause them to flee, an alarm sound that could help locate the victim and also help scare off an attacker, and a set of sensors that could detect everything from sudden deceleration to an irregular heartbeat or compromised breathing.

Just one sentence on the security and privacy issues:

Indeed, privacy concerns need to be addressed so that stalkers and predators couldn’t compromise the device.

Indeed.

Posted on April 19, 2010 at 6:30 AM
