Entries Tagged "law enforcement"

Page 33 of 46

Surveillance Cameras that Obscure Faces

From Technology Review:

A camera developed by computer scientists at the University of California, Berkeley, would obscure, with an oval, the faces of people who appear on surveillance videos. These so-called respectful cameras, which are still in the research phase, could be used for day-to-day surveillance applications and would allow for the privacy oval to be removed from a given set of footage in the event of an investigation.

An interesting privacy-enhancing technology.
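The core idea, reversible redaction, can be sketched in a few lines. This is a toy illustration, not Berkeley's actual implementation: the frame is a 2D array of pixel values, the "oval" is a rectangle, and the original pixels are escrowed in a separate store (which a real system would encrypt and access-control) so the redaction can be lifted during an investigation.

```python
# Toy sketch of reversible redaction: blank the face region in the
# published frame, keep the original pixels in a sealed store that is
# only opened in the event of an investigation.

from copy import deepcopy

def redact(frame, region):
    """Blank a rectangular region; return (public_frame, sealed_pixels)."""
    top, left, bottom, right = region
    public = deepcopy(frame)
    sealed = {}
    for r in range(top, bottom):
        for c in range(left, right):
            sealed[(r, c)] = public[r][c]   # original pixel, escrowed
            public[r][c] = 0                # the "privacy oval" (here a box)
    return public, sealed

def unredact(public, sealed):
    """Restore the escrowed pixels, reversing the redaction."""
    restored = deepcopy(public)
    for (r, c), value in sealed.items():
        restored[r][c] = value
    return restored

frame = [[10 * r + c for c in range(5)] for r in range(5)]
public, sealed = redact(frame, (1, 1, 3, 3))
assert public[1][1] == 0                 # face region is blanked
assert unredact(public, sealed) == frame # and fully recoverable
```

The interesting security property is in the escrow: the public footage leaks nothing, yet nothing is destroyed.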

Posted on June 26, 2007 at 7:41 AM

4th Amendment Rights Extended to E-Mail

This is a great piece of news in the U.S. For the first time, e-mail has been granted the same constitutional protections as telephone calls and personal papers: the police need a warrant to get at it. Now it’s only a circuit court decision—the Sixth U.S. Circuit Court of Appeals in Ohio—it’s pretty narrowly defined based on the attributes of the e-mail system, and it has a good chance of being overturned by the Supreme Court…but it’s still great news.

The way to think of the warrant system is as a security device. The police still have the ability to get access to e-mail in order to investigate a crime. But in order to prevent abuse, they have to convince a neutral third party—a judge—that accessing someone’s e-mail is necessary to investigate that crime. That judge, at least in theory, protects our interests.

Clearly e-mail deserves the same protection as our other personal papers, but—like phone calls—it might take the courts decades to figure that out. But we’ll get there eventually.

Posted on June 25, 2007 at 4:13 PM

License Plate Cloning

It’s a growing problem in the UK:

“There are different levels of cloning. There is the simple cloning, just stealing a plate to drive into say the Congestion Charge zone or evade a speed camera.

“It ranges up to a higher level which is the car criminal who wants to sell on a stolen car.”

Tony Bullock’s car was cloned even though his plates were not physically stolen, and he was threatened with prosecution after “his” car was repeatedly caught speeding in Leicester.

He said: “It was horrendous. You are guilty until you can prove you’re not. It’s the first time that I’ve thought that English law is on its head.”

Metropolitan Police Federation chairman Glen Smyth said the problem has grown because of the amount of camera-based enforcement of traffic offences, which relies on computer records of who owns which car.

Posted on June 11, 2007 at 1:52 PM

Remote Metal Sensors Used to Detect Poachers

Interesting use of the technology, although I’m sure it has more value on the battlefield than in detecting poachers.

The system consists of a network of foot-long metal detectors similar to those used in airports. When moving metal objects such as a machete or a rifle trip the sensor, it sends a radio signal to a wireless Internet gateway camouflaged in the tree canopy as far as a kilometer away. This signal is transmitted via satellite to the Internet, where the incident is logged and messages revealing the poachers’ position and direction are sent instantly to park headquarters, where patrols can then be dispatched.

Posted on June 6, 2007 at 11:06 AM

Third Party Consent and Computer Searches

U.S. courts are weighing in with opinions:

When Ray Andrus’ 91-year-old father gave federal agents permission to search his son’s password-protected computer files and they found child pornography, the case turned a spotlight on how appellate courts grapple with third-party consents to search computers.

[…]

The case was a first for the 10th U.S. Circuit Court of Appeals, and only two other circuits have touched on the issue, the 4th and 6th circuits. The 10th Circuit held that although password-protected computers command a high level of privacy, the legitimacy of a search turns on an officer’s belief that the third party had authority to consent.

The 10th Circuit’s recent 2-1 decision in U.S. v. Andrus, No. 06-3094 (April 25, 2007), recognized for the first time that a password-protected computer is like a locked suitcase or a padlocked footlocker in a bedroom. The digital locks raise the expectation of privacy by the owner. The majority nonetheless refused to suppress the evidence.

Excellent commentary from Jennifer Granick:

The Fourth Amendment generally prohibits warrantless searches of an individual’s home or possessions. There is an exception to the warrant requirement when someone consents to the search. Consent can be given by the person under investigation, or by a third party with control over or mutual access to the property being searched. Because the Fourth Amendment only prohibits “unreasonable searches and seizures,” permission given by a third party who lacks the authority to consent will nevertheless legitimize a warrantless search if the consenter has “apparent authority,” meaning that the police reasonably believed that the person had actual authority to control or use the property.

Under existing case law, only people with a key to a locked closet have apparent authority to consent to a search of that closet. Similarly, only people with the password to a locked computer have apparent authority to consent to a search of that device. In Andrus, the father did not have the password (or know how to use the computer) but the police say they did not have any reason to suspect this because they did not ask and did not turn the computer on. Then, they used forensic software that automatically bypassed any installed password.

The majority held that the police officers not only weren’t obliged to ask whether the father used the computer, they had no obligation to check for a password before performing their forensic search. In dissent, Judge Monroe G. McKay criticized the agents’ intentional blindness to the existence of password protection, when physical or digital locks are such a fundamental part of ascertaining whether a consenting person has actual or apparent authority to permit a police search. “(T)he unconstrained ability of law enforcement to use forensic software such as the EnCase program to bypass password protection without first determining whether such passwords have been enabled … dangerously sidestep(s) the Fourth Amendment.”

[…]

If courts are going to treat computers as containers, and if owners must lock containers in order to keep them private from warrantless searches, then police should be required to look for those locks. Password-protected computers and locked containers are an inexact analogy, but if that is how courts are going to do it, then it’s inappropriate to diminish protections for computers simply because law enforcement chooses to use software that turns a blind eye to owners’ passwords.

Posted on June 5, 2007 at 6:43 AM

Recognizing "Hinky" vs. Citizen Informants

On the subject of people noticing and reporting suspicious actions, I have been espousing two views that some find contradictory. One, we are all safer if police, guards, security screeners, and the like ignore traditional profiling and instead pay attention to people acting hinky: not right. And two, if we encourage people to contact the authorities every time they see something suspicious, we’re going to waste our time chasing false alarms: foreigners whose customs are different, people who are disliked by someone, and so on.

The key difference is expertise. People trained to be alert for something hinky will do much better than any profiler, but people who have no idea what to look for will do no better than random.

Here’s a story that illustrates this: Last week, a student at the Rochester Institute of Technology was arrested with two illegal assault weapons and 320 rounds of ammunition in his dorm room and car:

The discovery of the weapons was made only by chance. A conference center worker who served in the military was walking past Hackenburg’s dorm room. The door was shut, but the worker heard the all-too-familiar racking sound of a weapon, said the center’s director Bill Gunther.

Notice how expertise made the difference. The “conference center worker” had the right knowledge to recognize the sound and to understand that it was out of place in the environment in which he heard it. He wasn’t primed to be on the lookout for suspicious people and things; his trained awareness kicked in automatically. He recognized hinky, and he acted on that recognition. A random person simply can’t do that; he won’t recognize hinky when he sees it. He’ll report imams for praying, a neighbor he’s pissed at, or people at random. He’ll see an English professor recycling paper, and report a Middle-Eastern-looking man leaving a box on the sidewalk.

We all have some experience with this. Each of us has some expertise in some topic, and will occasionally recognize that something is wrong even though we can’t fully explain what or why. An architect might feel that way about a particular structure; an artist might feel that way about a particular painting. I might look at a cryptographic system and intuitively know something is wrong with it, well before I figure out exactly what. Those are all examples of a subliminal recognition that something is hinky—in our particular domain of expertise.

Good security people have the knowledge, skill, and experience to do that in security situations. It’s the difference between a good security person and an amateur.

This is why behavioral assessment profiling is a good idea, while the Terrorist Information and Prevention System (TIPS) isn’t. This is why training truckers to look out for suspicious things on the highways is a good idea, while a vague list of things to watch out for isn’t. It’s why this Israeli driver recognized a passenger as a suicide bomber, while an American driver probably wouldn’t.

This kind of thing isn’t easy to train. (Much has been written about it, though; Malcolm Gladwell’s Blink discusses this in detail.) You can’t learn it from watching a seven-minute video. But the more we focus on this—the more we stop wasting our airport security resources on screeners who confiscate rocks and snow globes, and instead focus them on well-trained screeners walking through the airport looking for hinky—the more secure we will be.

EDITED TO ADD (4/26): Jim Harper makes an important clarification.

Posted on April 26, 2007 at 5:43 AM

A Rant from a Cop

People use policemen as props in their personal disputes:

Noon, it’s 59 degrees and I get a call from a guy whose neighbor’s dog has been left in a car. I get there, the windows are cracked, and the dog has only been in there 20 minutes. It’s 59 degrees! It’s not summer and if it were the dead of winter I’d say the car is a $20,000 dog house. But it turns out this guy has a running dispute with his neighbor so guess who he calls to irritate the guy a little more? Me. When I go to leave, the asshole that called this in yells, “hey, aren’t you gonna do anything?” I explain why I am not and he says, “great, I’m writing a letter to the paper.” Holy shit. Now I’m the bad guy because I didn’t embarrass your target enough for you? Grow the hell up.

When the police implement programs to let ordinary citizens report suspected terrorists, this is the kind of thing that will result.

Posted on April 25, 2007 at 1:08 PM

Cameras "Predict" Crimes

New developments from surveillance-camera-happy England:

The £7,000 device, nicknamed “the Bug”, consists of a ring of eight cameras scanning in all directions. It uses software to detect whether anybody is walking or loitering in a way that marks them out from the crowd. A ninth camera then zooms in to follow them if it thinks they are behaving suspiciously.

[…]

“The camera picks up on unusual movement, zooms in on someone and gathers evidence from a face and clothing, acting as a 24-hour operator without someone having to be there,” said Jason Butler, head of CCTV at Luton borough council. “We have kids with Asbos telling us they hate the thing because it follows them wherever they go.”

This is interesting. It moves us further along the continuum into thoughtcrimes, but near as I can tell, the system just collects evidence on people it thinks suspicious, just in case. Assuming the data is erased immediately after, it’s much less invasive than actually accosting someone for thoughtcrime; the cost of false alarms is minimal.

I doubt it works nearly as well as the article claims, but that’s likely to change in 5 to 10 years. For example, there’s a lot of research being done in the area of microfacial expressions to detect lying and other thoughts. This is the sort of technological advance that we need to be talking about in terms of security, privacy, and liberty.
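The walking-versus-loitering distinction the article describes can be caricatured in a few lines. This is not the Bug's actual algorithm (which isn't public), just one crude heuristic: someone who covers a lot of path but goes nowhere is "loitering."

```python
# Toy loitering heuristic on 1-D position tracks: the ratio of distance
# walked to net displacement. Near 1 means purposeful walking; large
# means pacing back and forth in place.

def loitering_score(track):
    """Path length divided by net displacement (floored at 1)."""
    path = sum(abs(b - a) for a, b in zip(track, track[1:]))
    net = abs(track[-1] - track[0])
    return path / max(net, 1)

walker   = [0, 2, 4, 6, 8, 10]   # steady progress down the street
loiterer = [0, 2, 1, 3, 0, 1]    # pacing back and forth

assert loitering_score(walker) < loitering_score(loiterer)
```

Even this toy makes the false-alarm problem obvious: someone waiting for a bus scores exactly like someone casing a shop.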

Posted on April 19, 2007 at 6:20 AM

Arresting Children

A disturbing trend.

These are not the sorts of matters the police should be getting involved in. The police aren’t trained to handle children this age, and children this age don’t benefit by being fingerprinted and thrown in jail.

EDITED TO ADD (4/18): Another example:

Unfortunately, the school forgot that the clocks had switched to Daylight Saving Time that morning. The time stamps left on the hotline were adjusted by an hour after the Daylight Saving Time switch, causing Webb’s call to be logged at the same time the bomb threat was placed. Webb, who’s never even had a detention in his life, had actually made his call an hour before the bomb threat was placed.

Despite the fact that the recording of the call featured a voice that sounded nothing like Webb’s, the police arrested Webb and he spent 12 days in a juvenile detention facility before the school eventually realised their mistake.
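The failure mode is simple clock arithmetic, and worth seeing in miniature: an already-correct timestamp gets "corrected" for the DST switch, colliding an earlier event with a later one. The times below are illustrative, not from the actual case.

```python
# Toy reconstruction of the DST timestamp bug: shifting already-correct
# timestamps forward an hour makes two distinct events look simultaneous.

from datetime import datetime, timedelta

call_time   = datetime(2007, 3, 11, 10, 15)  # the student's call (correct local time)
threat_time = datetime(2007, 3, 11, 11, 15)  # the bomb threat, an hour later

# Buggy post-hoc "adjustment" for the spring-forward clock change:
adjusted_call = call_time + timedelta(hours=1)

assert adjusted_call == threat_time   # the two events now appear simultaneous
assert call_time < threat_time        # yet the call really came first
```

The general lesson: store timestamps unambiguously (e.g., in UTC) and never apply offset corrections to times that were already recorded correctly.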

Posted on April 18, 2007 at 12:02 PM

