Cameras "Predict" Crimes

New developments from surveillance-camera-happy England:

The £7,000 device, nicknamed “the Bug”, consists of a ring of eight cameras scanning in all directions. It uses software to detect whether anybody is walking or loitering in a way that marks them out from the crowd. A ninth camera then zooms in to follow them if it thinks they are behaving suspiciously.

[…]

“The camera picks up on unusual movement, zooms in on someone and gathers evidence from a face and clothing, acting as a 24-hour operator without someone having to be there,” said Jason Butler, head of CCTV at Luton borough council. “We have kids with Asbos telling us they hate the thing because it follows them wherever they go.”

This is interesting. It moves us further along the continuum into thoughtcrimes, but near as I can tell, the system just collects evidence on people it thinks suspicious, just in case. Assuming the data is erased immediately after, it’s much less invasive than actually accosting someone for thoughtcrime; the cost of false alarms is minimal.

I doubt it works nearly as well as the article claims, but that’s likely to change in 5 to 10 years. For example, there’s a lot of research being done on facial micro-expressions to detect lying and other thoughts. This is the sort of technological advance that we need to be talking about in terms of security, privacy, and liberty.
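The article gives no detail on how “marks them out from the crowd” might be computed. As a purely illustrative sketch (every feature, threshold, and name below is my own assumption, not Viseum’s design), one simple approach is to summarize each person’s movement track with a few statistics and flag outliers relative to the crowd:

```python
import math

def track_features(track):
    """Summarize a track (a list of (x, y) positions, one per second)
    as (mean speed, number of sharp direction changes)."""
    if len(track) < 2:
        return 0.0, 0
    speeds, turns = [], 0
    for i in range(1, len(track)):
        dx, dy = track[i][0] - track[i-1][0], track[i][1] - track[i-1][1]
        speeds.append(math.hypot(dx, dy))
        if i >= 2:
            pdx, pdy = track[i-1][0] - track[i-2][0], track[i-1][1] - track[i-2][1]
            if dx * pdx + dy * pdy < 0:  # heading changed by more than 90 degrees
                turns += 1
    return sum(speeds) / len(speeds), turns

def flag_unusual(tracks, z_cutoff=2.0):
    """Return indices of tracks whose features sit far from the crowd average."""
    feats = [track_features(t) for t in tracks]
    flagged = set()
    for dim in range(2):
        vals = [f[dim] for f in feats]
        mean = sum(vals) / len(vals)
        std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5 or 1.0
        flagged |= {i for i, v in enumerate(vals) if abs(v - mean) / std > z_cutoff}
    return sorted(flagged)
```

Note that anything of this shape detects “different,” not “criminal” — which is exactly the policy problem.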

Posted on April 19, 2007 at 6:20 AM • 41 Comments

Comments

greg April 19, 2007 6:42 AM

“I doubt it works nearly as well as the article claims, but that’s likely to change in 5 to 10 years”

I don’t think so. Really. We seem to have this false belief that the “truth” is there to be found. For a good criminal or a messed-up teenager, they are not lying, because they believe it.

Further, the problem I have with these systems is that the data will not be deleted. And what about probable cause? Getting caught on one of these cameras might then become sufficient for probable cause of committing a crime.

I think we just need to get over the fact that sometimes bad things happen. We are just too afraid.

Clive Robinson April 19, 2007 7:02 AM

Being one of those unfortunates who live and work around London (UK), and who are therefore likely to be on CCTV around 300-400 times a day, I have ambivalent thoughts about the system.

First off, I predict that it will have some well-publicised early successes and then become steadily less and less useful (the norm for all CCTV security systems).

The upside, I guess, is that it will probably need fewer CCTV operators, who normally follow their prejudices (e.g. the young kid with the hood over his head, or the blonde woman wearing a short skirt, etc.).

As for the criminals, they will learn fairly quickly how to avoid getting caught by the system, so the only thing to have changed is the profit margin of another snake-oil security company…

Clive Robinson April 19, 2007 7:08 AM

Oh, spot the comment from the MD of the company that’s selling it:

“Stuart Thompson, managing director of Viseum, conceded that the camera might zero in on an innocent member of the public, but he denied it was intrusive, claiming that the innocent had nothing to fear.”

That’s all right then…

dlg April 19, 2007 7:13 AM

The problem is not really the cameras themselves. I wouldn’t really mind such a system very much if it were used in the way described.

The issue is more that by buying and installing these systems, we do two things. First, we voluntarily set up an infrastructure that can be abused by some power-hungry politician, or by an ordinary criminal (e.g. for blackmail); often they are the same person. There are not even laws regulating maximum storage times, minimum storage security, etc., that would contain the possible damage somewhat.

Secondly, we condition everyone to accept that constant surveillance is OK. This is probably the greater cost overall, since it degrades the stability of a (free) society.

And greg is right: how long before someone is told in court, “So if it wasn’t you, why did the Sentinel® pick you out of the crowd 2 minutes before?”

As for the technical feasibility: these systems are meant to defeat (i.e. pick out and record) small-time crooks, and given the state of the art, they will work pretty well even in crowds in 5 years or so. Against knowledgeable opponents, they have no chance, not now and not in 10 years.

Hieronymous Cowherd April 19, 2007 7:17 AM

“This system would spot if someone was behaving strangely: for example, by constantly changing direction or going up and down stairs.”

Which sounds exactly like the behaviour pattern of someone waiting for a date. Nice.

Of course all the critics of CCTV in the UK fail to realise that as our criminals routinely wear striped shirts, face masks and carry large sacks marked SWAG they are relatively easy to spot by camera, leaving the police free to carry on with their vital work of shooting people who look a bit foreign.

Tom m April 19, 2007 7:31 AM

Last time I checked, our judicial system was based on “innocent until proven guilty.”

My mistake, obviously.

Alex April 19, 2007 7:53 AM

Even if it works, and nobody cares about “innocent till proven guilty”, there is still another problem: it targets only low-level crime/security problems. Corruption, fraud, and other white-collar crimes (often much more damaging to society) go undetected yet again.

Leonard April 19, 2007 8:02 AM

Cool. Now, once we learn what it considers suspicious, we can create diversions, and they will end up missing the real evidence.

Seriously, consider the soldiers in Iraq (Marines, I think) who kidnapped a suspect and used their knowledge of the surveillance, and its limitations, to set up the suspect and kill him in a staged firefight.

It was the perfect crime until guilt caused one of them to confess.

Bruce Schneier April 19, 2007 8:03 AM

“Hmm… to me, it sounds more like just attempting to automate looking out for someone who is acting ‘hinky’[0]. In that case, wouldn’t this be a good thing?”

This is actually a really good point. At the point where these systems are as good as a well-trained security guard looking for suspicious behavior, they will have real security value. Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?

Clive Robinson April 19, 2007 8:21 AM

@Bruce

“Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?”

The answer to the first is not really relevant, providing it is less than what the available resources can handle.

The second you already know the answer to in the U.K., which is,

“yes, and for as long as resources allow”.

This is based on the simple premise that you don’t know when you have missed something that might be of use in the future; it is the same reasoning used for keeping DNA profiles and fingerprints on file even though the person has not committed a crime.

Beta April 19, 2007 8:49 AM

How on Earth can one test a system like this? It’s relatively easy to make a system that will notice strange behavior, but how can the designer know what behavior actually correlates with crime?

It was inevitable that people would build AIs to sift through mountains of surveillance data. This twist of having an AI decide whether to gather the images of a particular passer-by in the first place is just temporary, until the memory and optics get good enough for it to record detailed video of everyone.

Spider April 19, 2007 9:09 AM

ASBOs themselves are scary. I had never heard of them before, so I looked them up on Wikipedia.

Here are some of the rarer reasons why people have been given “Anti-Social Behaviour Orders”:

* Two teenage boys from east Manchester forbidden to wear one golf glove.

* A 13-year-old forbidden to use the word "grass".

* A 17-year-old forbidden to use his front door. 

* An 87-year-old man ordered not to shout, swear or make "sarcastic remarks to neighbours or their visitors".

* In the centre of Manchester, a group of residents were calling for an ASBO against noisy builders on big construction sites.

* Children playing games in Grove Place Estate in Hampstead could receive ASBOs.

pointfree April 19, 2007 9:25 AM

@bruce: “Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?”

Two words: facial recognition, which will turn at some point (if not already; I’m no expert in this field as such) into facial fingerprinting. Whether your actual image is stored is one thing, but once your face is fingerprinted, tracking your movements and recording them over long periods of time becomes much more achievable. There will almost certainly be ways of tying this in with formal identification: ATMs or anywhere automated payment is made, tollways, etc.

Obviously there are problems (growing or removing facial hair, glasses, hats, etc.), but this won’t stop people trying. Given that we need photo ID for so many things these days (car licence, boat licence, gun licence, volunteer service identification, corporate employment, etc.), how long before these are tied together and facial fingerprinting is the precursor rather than a by-product? You’ll be allowed to shave your moustache or grow a beard, but it will be illegal not to have your photo ID, and accordingly your facial fingerprint, updated…
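For what it’s worth, a “facial fingerprint” in this sense would just be a compact numeric template extracted from a face image and matched by distance. A minimal sketch, assuming some hypothetical model has already produced the template vectors (the model, gallery, and threshold here are illustrative assumptions):

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_face(probe, gallery, threshold=0.8):
    """Return the identity of the best-matching enrolled template, or None.

    `probe` is the template for the face seen on camera; `gallery` maps
    identity -> enrolled template vector."""
    best_id, best_sim = None, threshold
    for identity, template in gallery.items():
        sim = cosine_similarity(probe, template)
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id
```

Once such templates exist, linking sightings across cameras, ATMs, and ID databases is just a join on the match results.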

Paranoid? Or are they really out to get us all? 😉

Mike April 19, 2007 9:35 AM

“This system would spot if someone was behaving strangely: for example, by constantly changing direction or going up and down stairs.”

“Which sounds exactly like the behaviour pattern of someone waiting for a date. Nice.”

Which raises the question: can the system track more than one person behaving strangely?

Can I send in a person to draw attention by limping down the street then mug someone while the system is misdirected?

Who decides what is “normal” anyway? It might be fun to get a couple of zombie flash mobs to go by the system and see if it can be fooled into thinking zombies are normal…

Anonymous April 19, 2007 9:48 AM

Re: attempting to automate looking out for someone who is acting ‘hinky’

@Bruce

“At the point where these systems are as good as a well-trained security guard looking for suspicious behavior, they will have real security value. Then, two questions remain: what are the false alarm rates, and is the data on innocents stored?”

Ah, you forgot the third question: what is the actual escalation process when an alarm is triggered?

Does the camera contain a loudspeaker that shouts out, “DANGER, WILL ROBINSON”? Does the camera call the local police? What is their response time? When they arrive, can they easily identify the source of the alarm? What do they do? Do they arrest everyone, just to be sure? Does triggering a camera alarm give them probable cause to search everyone in the area? Just the suspect? Will the police now interrogate the suspect? Will this turn into a “the camera says you’re guilty, therefore I will treat you like a convicted felon instead of a suspected citizen” situation? By the time these things have that fine-grained analytical ability, we may have a robotic police force. Does the ED-209 summarily execute the suspect?

False alarm rates are critical, but as we know, the false alarm rates for polygraphs are pretty high, and yet polygraphs are still treated as authoritative evidence by a lot of people.

Previous employee April 19, 2007 9:48 AM

I worked for the company who developed Viseum (Caederus in Swansea) until about 3 years ago.

As I understand it, development has continued since then on an ad-hoc basis with one part-time developer, so I doubt the system has changed all that much since I worked there.

The system was designed to track “objects” between certain speeds – a job it did pretty well (sometimes getting confused if people went behind lamp-posts, etc.). I’m guessing that it’s been extended to simply note these objects as “more interesting” when these tracked objects stop for a certain length of time. The system tries to follow the most interesting object it can see for as long as possible, or until another object becomes more interesting.

“near as I can tell, the system just collects evidence on people it thinks suspicious, just in case. Assuming the data is erased immediately after, it’s much less invasive than actually accosting someone for thoughtcrime; the cost of false alarms is minimal.”

That was certainly the case when I was there. Data was recorded for 28 days then discarded.
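Based on that description, the selection logic presumably looks something like the sketch below. This is a guess at the shape of it; every name, weight, and threshold is mine, not Caederus/Viseum code:

```python
class TrackedObject:
    def __init__(self, oid):
        self.oid = oid
        self.speed = 0.0          # current speed estimate
        self.stopped_for = 0      # seconds spent stationary
        self.in_watch_area = False

def interest(obj, min_speed=0.2, loiter_after=30):
    """Score how 'interesting' a tracked object currently is."""
    score = 0.0
    if obj.speed < min_speed and obj.stopped_for > loiter_after:
        score += 2.0              # stopped for a while: loitering
    if obj.in_watch_area:
        score += 1.0              # inside a designated "watch" area
    return score

def choose_target(objects, current=None, switch_margin=0.5):
    """Follow the most interesting object, switching only when another
    object becomes clearly more interesting than the current one."""
    if not objects:
        return None
    best = max(objects, key=interest)
    if current is not None and interest(best) <= interest(current) + switch_margin:
        return current            # hysteresis keeps the PTZ camera from thrashing
    return best
```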

Roy April 19, 2007 10:34 AM

I could imagine a demonstration of this system. A few hundred shills walk through the field of view, then one of them acts hinky, and the system zooms in on him, to the amazement and wonder of the invited guests. “Ta-da, there it is folks. Now, start shoveling money our way.”

In a real-world application, it will be a different story. Suppose a thousand people are in the field of view, and 100 of them fire off the ‘hinky’ detector. Which 99 will be spared the closer inspection? The luck of the draw means that a malefactor is almost guaranteed to be spared.

If the sensitivity is reduced to keep the false alarms down to a tolerable level, will this help?

No, it will not. Criminal behaviors are statistical rarities. For counts, figure one act per person per second in real time. Almost every single act will then be someone doing something law-abiding, like walking the way they’re going, or standing while waiting for the bus, or window shopping.

Out of a thousand people in view, how many will actually be pickpockets? Very few, if any. Most of what a pickpocket does is expertly blend in with the crowd while surveilling it for his next victim. The actual criminal acts — the pick and the pass (to the partner) — will be barely detectable to all but the trained eye, and will be over in an instant.

Watching for exceedingly rare events guarantees that virtually all positives will be false positives, and the occasional acts of interest will usually be false negatives — undetected.
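To put rough numbers on that (all invented for illustration): suppose 1 person in 10,000 in view is actually committing a crime, and the detector catches 90% of offenders while also flagging 5% of everyone else.

```python
# Illustrative base-rate arithmetic with invented numbers.
p_crime = 1 / 10_000        # prevalence of actual offenders in view
sensitivity = 0.90          # detector catches 90% of real offenders
false_positive_rate = 0.05  # and flags 5% of the innocent

p_flag = sensitivity * p_crime + false_positive_rate * (1 - p_crime)
p_crime_given_flag = sensitivity * p_crime / p_flag
print(f"{p_crime_given_flag:.2%} of flagged people are offenders")
# ~0.18%: over 99.8% of all alarms are false positives.
```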

Suppose a shooting occurs out of view, and the shooter, along with dozens of people chasing after him, comes into the field of view. Which runner will the system select for closer scrutiny?

Or make it a few dozen gang members chasing after their intended victim. Which runner will the system select for closer scrutiny?

That said, the system would be very good at detecting and tracking a drunk stumbling around an empty parking lot at 3 a.m. But so would a cheap commercial VCR setup.

Previous employee April 19, 2007 10:44 AM

@Roy: You asked “Which runner will the system select for closer scrutiny?”

If everyone were running in a group, it’d capture the whole group. If they split into separate groups, then it’d decide which was most interesting (based on a load of configurable settings: size, speed, entering areas that had been designated as “watch” areas, etc.) and pick one to track. If, in the meantime, the other group became more interesting, it would switch to that group instead.

Of course, it captures not only from the PTZ camera but also from all the fixed reference cameras around it. These were fixed wide-angle, so they wouldn’t capture much in the way of detail.

(Caveat: all this is based on how the system worked a few years ago).

Richard April 19, 2007 10:47 AM

This offers further confirmation of my long held suspicion that you can get away with murder if you just behave with confidence.

Mike April 19, 2007 10:48 AM

“detect whether anybody is walking or loitering in a way that marks them out from the crowd”

Of course this couldn’t possibly be a tourist who is a bit lost, or anything like that… could it?

Previous employee April 19, 2007 10:54 AM

@Mike: You said “Of course this couldn’t possibly be a tourist who is a bit lost, or anything like that… could it?”

Yes, it certainly could, and it’d probably record that lost tourist too. But remember that this (as it stands) captures pictures which can be viewed at a later date if something happens and then gets reported to the police, or similar. At that point there’s a timeframe to search through, and it’ll have captured all the suspicious-looking events.

If nothing happens, that footage will fall by the wayside and get deleted after a pre-determined amount of time.

Archangel April 19, 2007 10:54 AM

So, you’d rather have them profile based on behavioral attributes than on anything else, because a terrorist is not an ethnicity, it’s a behavioral set determined by intent to act in a certain way.

But you’d rather not have surveillance that is based on a heuristic of pattern-violation because we’re bordering on “thoughtcrime”? Acting in an unusual fashion is most often not criminal. Acting criminally is criminal. However, criminal behavior is aberrant. Notice the lack of enforcement inherent in replacing guards with cameras. Notice the emphasis on behavioral profiling, with a system that is programmed to detect pattern violations, not skin-color violations. If you’re going to have an observation system prone to abuse, it should at least be one based on good heuristics. Punish the authorities for abusing the information, not for gathering information based on good analysis tactics.

Chris S April 19, 2007 11:20 AM

“Of course this couldn’t possibly be a tourist”

It sure could. We work in a part of the city (Toronto) that is near the waterfront, near the theatre district, near downtown, near a major residential area, near the world’s tallest tower, and near a 50,000 seat stadium. I’ve occasionally surprised co-workers by offering to help tourists find their way — before even the tourists had figured out they were lost.

In every case, they were behaving differently than both people who work in the area and tourists who were not lost.

(This part of Toronto can be particularly hard to navigate for the inexperienced tourist because it requires navigating in three dimensions. I’ve seen people lost when they were literally on top of where they wanted to go.)

SixDays April 19, 2007 12:03 PM

“Well son, the camera picked you up so you must have done something.”
Go to jail. Go directly to jail. Do not pass court. Do not collect legal counsel.

Derp April 19, 2007 12:27 PM

You don’t need cameras, just a whole lot of computing power.
“In fact, a crime mapping and forecasting system is already in the alpha-test stage in two American cities. With funding from the Justice Department, computer scientist Andreas Olligschlaeger, criminologist Jacqueline Cohen, and I amassed individual reports from police departments in Pittsburgh and in Rochester, New York.”
Source: Wired. Cloudy, With a Chance of Theft
http://www.wired.com/wired/archive/11.09/view.html
They were able to predict monthly criminal activity before it happened, with 80 percent accuracy. A good start. It makes cameras look like toys.
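The Wired piece doesn’t describe the model, so here is only a toy illustration of the general idea of forecasting an area’s monthly crime count from its own history (the weighting scheme and numbers are mine, not the Pittsburgh/Rochester system’s):

```python
def forecast_next_month(monthly_counts, decay=0.6):
    """Exponentially weighted average of past monthly counts:
    recent months count for more than older ones."""
    weight, total, norm = 1.0, 0.0, 0.0
    for count in reversed(monthly_counts):  # most recent month first
        total += weight * count
        norm += weight
        weight *= decay
    return total / norm

burglaries = [14, 9, 12, 17, 11, 13]  # invented monthly counts for one area
print(round(forecast_next_month(burglaries), 1))  # -> 12.8
```

Real systems presumably add leading indicators (calls for service, season, and so on), but the principle is the same: area-level crime counts are regular enough to forecast.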

markm April 19, 2007 1:03 PM

Spider listed some very peculiar ASBOs (the British version of the restraining order). Such orders can get quite creepy in some very different ways.

  1. Many of them I’ve read of in news stories from the UK come off sounding like, “don’t break the law (again), or I’ll be forced to speak harshly to you again.” (Not that restraining orders in the USA are much better, but American laws and court orders aren’t hollow threats so often.)

  2. The rest amount to judge-made laws imposed only on certain people. There certainly are reasons for this sometimes – e.g., if the defendant’s excuse for a long string of petty crimes is, “I was drunk,” forbidding him to drink just might prevent a recurrence, at a lower cost than prison and with a better chance of him staying reformed – but I can’t imagine the reason behind most of the orders Spider cited, and the notion of a judge imposing different laws on you than on me is just plain creepy.

  3. In the USA, it is frighteningly easy to restrict a man’s rights by getting a “domestic violence” restraining order, without anything resembling due process as I know it. My daughter’s ex was calling up their kid’s preschool and frightening the staff. (He never frightened her; she’d have kicked his ass, but he did like to bully those who don’t know how to defend themselves.) She merely had to fill out a form alleging this one morning, and by 5:00 PM the court had posted an order barring him from contacting or going near the preschool. She wasn’t abusing the system, but look at how easy it is to abuse a system that takes hearsay that someone else was “frightened” as evidence of wrongdoing, and gives the accused no chance to defend himself until after he has lost rights for weeks or months while waiting for a court date.

Tim April 19, 2007 1:09 PM

“I doubt it works nearly as well as the article claims, ”

Such claims as the article makes are a transparent, one-sided abuse of statistics.

m April 19, 2007 2:15 PM

I heard that the kids in Nottingham used to run in sight of the surveillance cameras. The operators thought something was up and sent cops after them. Finally, running on the streets was more or less forbidden; the person who told me this had a hard time finding a route to go jogging.

Now, you can’t forbid people to act hinky. It makes a great way of spamming the system.

Jason April 19, 2007 10:40 PM

Some months ago I heard an item on the BBC World Service. The UK Government was privatizing one of its military research organizations. They had someone from the organization on the Beeb telling us what great technology they could offer the public. One example was a technology that could be used to detect people squirming in airline seats. The interviewee said it could be used by airline staff to spot terrorists.

The BBC interviewer didn’t ask a single intelligent counter-question, e.g.: aren’t many people nervous when flying, and what if the terrorists aren’t? Instead, the BBC interviewer gave this guy a free run.

The BBC do this a lot, even on their supposedly tech-savvy “Digital World” program. They did a piece about voting over the Internet in Lithuania. The reporter babbled about how exciting and convenient it was. She didn’t give any consideration or air-time to the prospect that it could be hijacked. Given the negative publicity over Diebold’s elections in the U.S., and that this was supposedly a tech program, you’d think they’d have clue enough to mention it.

We need more sheeple.

Andy April 20, 2007 3:18 AM

Hmm. One tracking camera. So you have a couple of black or Asian kids in hoodies loitering on the street while their white friend mugs someone.

Utterly unconvinced.

Anonym2 April 20, 2007 3:39 AM

@m: completely agreed, and everyone should be advised to do the same…

This is just another money-making activity set up by the UK government. True, these cameras could potentially detect some criminal activity – but do you remember the case where a woman with a buggy put her bag on top of it, then walked on – and she was picked up for littering because the security camera didn’t recognise her buggy?

@Derp: that sounds very much like the BS that the “Every Child Matters” government programme is. How can you predict crime to that level of accuracy? Sure, there are a few people in the system with a high probability of committing crime, and a few more who are part of organised crime. But what about the rest?

Is this not just another drop in the ocean of surveillance that is possible because there are companies out there that can make money out of this “opportunity”? [“Opportunity” is a word Tony Blair used a great deal in his reply to the people against ID cards.]

Ian Ringrose April 20, 2007 5:06 AM

This is a real-life story of how surveillance cameras made my life better on Sunday and gave me more freedom.

I live in Cambridge (UK). Sunday was a very nice hot day, so I decided I wanted to read my book outside in a quiet place. I cycled to the local science park (which is private land) and sat next to their lake, about 20 feet from some of the office buildings, on one of the seats put there by the users of the building. There are no fences to keep me out, no keep-out signs, just a lot of cameras to track people. I saw that a camera was zoomed in on me to check what I was doing; this was a lot better than being disturbed by a guard asking me what I was doing and then maybe telling me to leave.

There is a manned office on site that checks all the cameras and then directs the guard to where he is most useful. I saw him walking along the other side of the lake (this may have been so that I knew I was being watched). Before they put in cameras, they were planning to close off the science park and put guards on all the gates.

alastair green April 20, 2007 9:08 AM

@spider
You said:
“Here are some of the rarer reasons why people have been given ‘Anti-Social Behaviour Orders’:

  • Two teenage boys from east Manchester forbidden to wear one golf glove.
  • A 13-year-old forbidden to use the word “grass”.”

I am not a fan of ASBOs in general, but given that we have them, these two examples are not that unusual. The first is presumably some sort of gang symbol. The second is because “grass” is slang for informer. I assume this kid was intimidating someone and shouting that they were a grass. In some areas, this can have a similar effect to accusing someone of being a paedophile.

derf April 20, 2007 11:28 AM

Seems like Monty Python’s “Ministry of Silly Walks” just got a whole new reason for being.

Pete April 23, 2007 7:22 AM

Personally I’m waiting for CCTVtube: the publicly visible archive of all the footage.

I don’t think there’s any legal obstacle to it; it’s just like those “100 best crime videos” TV programmes, but without an editor.

Derp April 25, 2007 10:36 AM

“How can you predict crime to that level of accuracy?”

Data analysis.
http://www.icpsr.umich.edu/NACJD/SDA/das.html
http://andromeda.rutgers.edu/~wcjlen/WCJ/

By the book.
Prediction and Classification: Criminal Justice Decision Making, a collection of commissioned essays by distinguished international scholars, is the ninth volume in the Crime and Justice series. Like its predecessors, Prediction and Classification is essential reading for scholars and researchers seeking a unified source of knowledge about crime, its causes, and its cure.
http://www.press.uchicago.edu/cgi-bin/hfs.cgi/00/2396.ctl

Methods for Estimating Crime Rates of Individuals
Describes methods for analyzing offenders’ crime commission data and deriving (1) individuals’ crime commission rates and (2) rate distributions for groups of offenders with specified characteristics. Uncertain data are treated as censored observations, to obtain nonparametric maximum-likelihood estimates of the distribution of observed crime rates. No standard distributional form was found satisfactory for all crime types, and some types apparently do not occur according to a Poisson process. Shrinkage estimators of individuals’ crime commission propensities are obtained by dividing offenders into groups and shrinking data toward a regression estimate of an individual’s propensity, based on personal characteristics. A new multivariate distributional form for characterizing the joint distribution of individual crime counts is derived and fit to inmate survey data. Populations that can be surveyed (e.g., prisoners) are unrepresentative of target offender populations of primary interest. Sampling probabilities of surveyed individuals are estimated with stochastic models, allowing estimation of crime rate distributions in target populations.
http://www.rand.org/pubs/reports/R2730/
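The “shrinkage” idea in that abstract is easy to illustrate. Below is a generic empirical-Bayes sketch (my formula and numbers, not RAND’s actual estimator): an individual’s observed rate is pulled toward the group average, more strongly when the individual has been observed only briefly.

```python
def shrunken_rate(crimes, years_observed, group_rate, prior_strength=2.0):
    """Pull an individual's raw crime rate toward the group average.

    `prior_strength` acts like extra years of observation at the group
    rate: with little real data the estimate stays near the group rate,
    and with more data it approaches the raw individual rate."""
    return (crimes + prior_strength * group_rate) / (years_observed + prior_strength)

group_rate = 0.5  # group average: 0.5 crimes/year (invented)
print(shrunken_rate(2, 1, group_rate))    # raw rate 2.0, 1 year observed  -> 1.0
print(shrunken_rate(20, 10, group_rate))  # raw rate 2.0, 10 years observed -> 1.75
```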

“Sure, there are a few people in the system with a high probability of committing crime, and a few more who are part of organised crime. But what about the rest?”

The Conception of Criminality Illustrated by a Stochastic Process Model for Deviant Behavior
http://jrc.sagepub.com/cgi/content/abstract/9/1/31?ck=nck
“Neural network algorithms are emerging nowadays as a new artificial intelligence technique that can be applied to real-life problems.”
http://doi.ieeecomputersociety.org/10.1109/RISP.1992.213257
