$20M Cameras at New York's Freedom Tower are Pretty Sophisticated

They’re trying to detect anomalies:

If you have ever wondered how security guards can possibly keep an unfailingly vigilant watch on every single one of dozens of television monitors, each depicting a different scene, the answer seems to be (as you suspected): they can’t.

Instead, they can now rely on computers to constantly analyze the patterns, sizes, speeds, angles and motion picked up by the camera and determine—based on how they have been programmed—whether this constitutes a possible threat. In which case, the computer alerts the security guard whose own eyes may have been momentarily diverted. Or shut.

An alarm can be raised, for instance, if the computer discerns a vehicle that has been standing still for too long (say, a van in the drop-off lane of an airport terminal) or a person who is loitering while everyone else is in motion. By the same token, it will spot the individual who is moving rapidly while everyone else is shuffling along. It can spot a package that has been left behind and identify which figure in the crowd abandoned it. Or pinpoint the individual who is moving the wrong way down a one-way corridor.

Because one person’s “abnormal situation” is another person’s “hot dog vendor attracting a small crowd,” the computers can be programmed to discern between times of the day and days of the week.

Certainly interesting.
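The article doesn’t describe the actual implementation, but the rules it lists (dwell time, speed relative to the crowd, abandoned objects, wrong-way movement, time-of-day profiles) map onto fairly simple checks over tracked objects. Here is a minimal, hypothetical sketch of that kind of rule layer; the field names, thresholds, and profile format are all assumptions, not details of the Freedom Tower system:

```python
# Illustrative sketch only: a toy rule layer of the kind the article describes.
# All class fields, thresholds, and the time-of-day profile are hypothetical,
# not details of the actual system.
from dataclasses import dataclass
from statistics import median
from typing import Optional

@dataclass
class Track:
    track_id: int
    kind: str                     # "person", "vehicle", or "package"
    speed: float                  # metres per second, from the video tracker
    dwell_seconds: float          # time spent roughly in one place
    wrong_way: bool = False       # moving against a one-way corridor
    abandoned_by: Optional[int] = None  # for packages: track that left it behind

def flag_anomalies(tracks, hour, profile):
    """Return (track, reason) pairs for a human operator to review."""
    # Thresholds can vary by hour, so the lunchtime hot-dog crowd isn't an "anomaly".
    max_dwell = profile.get(hour, {}).get("max_dwell_seconds", 300)
    person_speeds = [t.speed for t in tracks if t.kind == "person"]
    crowd_speed = median(person_speeds) if person_speeds else 0.0

    alerts = []
    for t in tracks:
        if t.kind == "vehicle" and t.dwell_seconds > max_dwell:
            alerts.append((t, "vehicle stationary too long"))
        if t.kind == "person" and t.dwell_seconds > max_dwell and crowd_speed > 0.5:
            alerts.append((t, "loitering while everyone else is moving"))
        if t.kind == "person" and crowd_speed > 0 and t.speed > 3 * crowd_speed:
            alerts.append((t, "moving much faster than the crowd"))
        if t.wrong_way:
            alerts.append((t, "moving the wrong way down a one-way corridor"))
        if t.kind == "package" and t.abandoned_by is not None:
            alerts.append((t, f"package left behind by track {t.abandoned_by}"))
    return alerts
```

The hard parts, of course, are the tracker feeding this layer and the tuning of the thresholds; as many of the comments below point out, that tuning is what decides whether the alerts are useful or simply ignored.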

Posted on September 25, 2008 at 6:32 AM • 53 Comments

Comments

Bernie September 25, 2008 7:07 AM

Can the computers be programmed to tell the difference between a terrorist running to/from a target and a person who REALLY has to use the bathroom?

John Moore September 25, 2008 7:11 AM

It sounds too good to be true and likely is. I can only imagine the false positives if it isn’t tuned properly, or the false negatives if it’s tuned badly, or the alarms are just ignored because “they do that all the time”. Hopefully, the program was used by Vegas casinos before being deployed at this spot.

bob September 25, 2008 7:33 AM

Humans can’t watch something continuously and remain useful. So if these things actually detect anomalies, they would be pretty valuable.

However, if they constantly generate false positives, then we will simply have spent a lot of money to elevate to the first derivative of the original “can’t watch something continuously”.

Kieran September 25, 2008 7:34 AM

This could be a useful tool if used properly, largely for flagging stuff that might be worth looking at. And a few years of correcting false positives and negatives could result in a quite nifty little system.

Just so long as they don’t send men with guns to every false positive!

beige on beige September 25, 2008 7:43 AM

Interesting indeed. A hinky detector?

Aren’t the very people that this is trying to detect trying to blend in?

This is a big facility. How much would normal security cameras cost? What’s the incremental cost of the new technology?

If it’s big, I hope that a pilot version was critically examined before they spent all the money.

We’ll see how well it works.

Waiting with worms on my tongue …

(baited breath … yes and I know it’s really bated)

Noble_Serf September 25, 2008 7:50 AM

Interesting? Yes.

A preventative measure? Hardly.

If the system worked well, the guards would grow weary of even the rarest false positives. Time is on the side of the aggressor. They can wait another 10 years to attempt something large-scale, or they can try it tomorrow. Smart money is on the former.

If a serious threat presented itself on the screens, I doubt the reaction could be both timely and appropriate to the threat. Deploy a whole squad to check out an abandoned shoebox? Deploy the new guy to check on two characters loitering in a van?

It makes people feel better.

It allows for prosecution of crime.

I don’t think it has much to do with counter-terrorism.

Let’s work on the front-end of the problem for a while.

JD September 25, 2008 8:14 AM

Quote: “Because one person’s “abnormal situation” is another person’s “hot dog vendor attracting a small crowd,” the computers can be programmed to discern between times of the day and days of the week.”

Doesn’t this effectively introduce a deliberate ‘blind spot’? Surely it wouldn’t be too hard for a terrorist to identify these particular places and times and commit their crimes accordingly?

Muffin September 25, 2008 8:19 AM

Interesting, but for the wrong reasons. The only thing it’ll accomplish is to train the guards to ignore the alerts, since they’ll obviously be meaningless; either that, or it’ll train them to assault anyone who is standing out from the crowd.

Neither seems like a particularly desirable scenario to me.

Bryan September 25, 2008 8:43 AM

None of the examples given of things it could be programmed to detect is likely to be indicative of a threat. I could explain away every one of those examples and have done most of them. Stopped in a drop-off lane too long? Rushed to the bathroom or the bus? Forgot a fast-food bag on a bench? Waited for someone at the airport or metro, standing alone for a long time while everyone else is moving?

Waste of money, and only a matter of time before more innocent people find themselves at gunpoint.

JJ September 25, 2008 8:48 AM

Bryan:
only a matter of time before more innocent people find themselves at gunpoint.


Yup. That’s the direction we’re heading, I think.

Stephen Daugherty September 25, 2008 8:54 AM

My sense has always been that artificial intelligence is far in the future, but they’ve got artificial stupidity figured out pretty well.

What you need to do is hire enough security personnel to pay attention to all the monitors and real life points of interest. Human beings will be better at judging most threats for quite some time to come, and distinguishing between legitimate threats and the false positives that happen to fit a profile.

Roy September 25, 2008 9:14 AM

If it actually works as intended, it will find all the plainclothes officers, ‘outing’ them to security people in view of the public.

Timmy303 September 25, 2008 9:27 AM

Machines will always be better at detecting raw anomalies than people, but the advantage of this effectiveness is offset by their tendency to detect anomalies that don’t signify risk, hence their terrible false positive rate.

This causes more problems than it solves, and the diversion of resources away from effective measures and toward supporting security systems that don’t improve security makes everyone less safe.

Clive Robinson September 25, 2008 9:28 AM

@ Bruce,

Is this the same system as was installed on the London Underground that you blogged about not so long ago?

panopticon September 25, 2008 9:33 AM

If the past is any indicator, there will be too many false positives and this system will be turned into a forensics tool for looking at past events. Video analytics and AI just aren’t ready for prime time yet.

xxx September 25, 2008 9:46 AM

@Bernie: this is a case where false positives are acceptable [if used correctly] — the goal is to alert a human to oddities (i.e. possible threats), not to detect threats.

CGomez September 25, 2008 9:50 AM

Yeah, so what if there are false positives, as long as the policy is to have a well-trained security guard check things out? A package has been sitting there awhile? Well, send a guard over there. Watch it a few more minutes…

I think most people see abnormal things every day, wait a few seconds to understand what is really going on, and then go about their business.

At the same time, I don’t know if it’s worth all the research. The security departments in casinos think they do a pretty good job of spotting anomalies and reacting, but I can’t honestly tell you if they have a problem with confronting people who have done nothing wrong, or what their actual loss statistics are (by definition, no one should know if a real loss occurs).

JustSomeGuy September 25, 2008 9:57 AM

The next big terrorist job involving an airport will be done by the hot dog vendor, or other insider.

leadhyena September 25, 2008 10:17 AM

It seems to me that this system can be sufficiently DDoSed if a group of intruders worked together to make more distractions than the guards and systems are able to handle. In a normal distraction attack, where Attacker A creates a distraction to draw guards so that Attacker B gets through unnoticed, guards have a small chance to suspect that A and B are colluding. With an automated system, it’s essentially an open brain: the attackers can play with it until they have a diversion plan they know will succeed, especially if the guards are leaning too much on the software.

I’m afraid that the attention enhancement this software provides will numb the intuition that a seasoned security professional can provide.

Richard September 25, 2008 10:34 AM

“it will spot the individual who is moving rapidly while everyone else is shuffling along”

Great. I am tall and have long legs. So I invariably find that I walk much faster than everyone else on the street around me. Does this mean that I’ll now find myself being stopped by the police and asked to explain why I’m walking so fast?

Sparky September 25, 2008 10:59 AM

@Richard: same here, and I really hate people “shuffling” everywhere in malls and stuff, when I just want to walk at my normal pace.

Has anyone ever found a correlation between bad intent and hinky behaviour?
I mean, all this profiling and stuff seems to be aimed at the hinky people, but what if the real terrorists just aren’t hinky, because they have practiced well, and are just trying to blend in?

bobabooey September 25, 2008 11:39 AM

I could use this $20M P.O.S. to babysit my toddler…

Does it round up people who venture outside a predefined border?

Anonymous September 25, 2008 11:49 AM

@ Sparky,

“Has anyone ever found a correlation between bad intent and hinky behaviour?”

The answer is a qualified yes.

When people are new to a crime such as shoplifting (store theft), they tend to have odd movements or behaviour. That is, it is their conscious mind dictating the way they walk etc., as opposed to their “monkey brain”.

It is almost identical to spotting “new drivers” on the road.

However, as shoplifters become more experienced, it becomes more a matter of habit and the tell-tale behaviour starts to disappear.

By definition a suicide bomber is not experienced, so you might well expect this conscious-v-subconscious or hinky behaviour to give them away to an experienced observer.

Which gives me a thought: “store detectives” are well practiced at spotting this hinky behaviour, so they may well be better at it than your average TSA employee.

Clive Robinson September 25, 2008 11:53 AM

I’ve done it again, the curse of the small screen has struck…

The above reply to Sparky was by me.

I vaguely remember that this blog used to catch a blank “name” field and ask you to reconsider, as too many anonymous etc. etc.

@Bruce / Moderator

Can we have it back please?

George September 25, 2008 11:57 AM

@CGomez:”…but I can’t honestly tell you if they have a problem with confronting people who have done nothing wrong…”

Someone who triggers an alarm has done something wrong. It may not involve a crime or threat, and further investigation may reveal that the trigger was a false positive. But it’s something wrong by definition. That makes every false positive a “success” to be counted and proudly displayed in a PowerPoint chart to superiors as proof of effectiveness. So the guards should have no problem at all confronting anyone. Until the person who triggered the alarm is proved innocent, he is presumed guilty and dealt with accordingly.

Mark September 25, 2008 12:10 PM

You’re all bringing up good nitpicks about the system, but it seems to me that if implemented by sane people with a healthy dose of common sense (and a thorough understanding of what the system does and doesn’t do), it could do a good job of supplementing the eyes of experienced security guards.

OK, so the odds of that happening are statistically insignificant… but I think you’re all a bit quick to knock it when none of us know how well it will or could work.

Nomen Publicus September 25, 2008 12:27 PM

While I understand the symbolism of the Freedom Tower (horrible name), what is the point of such a building today?

With half-decent broadband, most office jobs can be done almost anywhere. 90% of what I do each day could be done from anywhere with a phone and/or broadband connection.

Why spend billions building and protecting something that is already obsolete?

Another Kevin September 25, 2008 12:28 PM

As I see it, the most worrisome problem with this scheme is that liability-conscious security staff will feel it necessary to investigate false positives. Someone has to bear the cost of those investigations – and it seems “fairest” in some sense to put the expense on those who called attention to themselves by acting suspiciously.

This seems to be another step down the road of criminalizing suspicious behaviour because it diverts societal resources in much the same way as turning in a false fire alarm. It seems a perfectly reasonable thing in almost any individual case. The people who installed the Mooninites in Boston, or Star Simpson, or the guy who fumbled his iPod into an airplane toilet, “should have known” that their actions might cause a disproportionate response, and “deserve” their prosecutions. But do we really want a society where we have to scrutinize our actions for what might raise suspicion? Should I have to worry, when entering a Federal building, whether wearing a tie and jacket will make me more or less suspicious? Whether a fedora, or a baseball cap, or going bareheaded, will label me as identifying with some terror group or other? Whether I’m walking at the “right” speed? Whether my knee injury will cause my gait to trip a red flag? I’ve already fallen under the “should have known better than to arouse suspicion” for simple things like entering a shopping mall, amusement park, or government building and forgetting that I have a penknife, screwdriver or flashlight in my pocket. I’ve already fallen under the “should have known better” for packing a reserve supply of my wife’s insulin syringes in my suitcase, in hopes that our luggage doesn’t both come to grief (I’m not diabetic, so I’m “not allowed”).

What level of self-scrutiny is enough? If I’m going to be a criminal for wearing the wrong clothes, then publish a uniform standard, so that I can buy accordingly. If I’m going to be a criminal for carrying the wrong tools in the wrong places, then at least put me on notice of what’s allowed where. If I’m going to be a criminal for walking with a limp or having a knee brace trip a metal detector, then throw me in jail right now.

Anonymous September 25, 2008 12:49 PM

…and then they’ll replace the guards with robots which can react more quickly and with greater force… shades of RoboCop and ED-209…

alice September 25, 2008 1:28 PM

Obvious problem: terrorists will become hot dog vendors, and ply their false trade for weeks before they strike. The system will detect them every time, but the guards will be fooled by the fact that they can buy great hot dogs at good prices.

This is a classical approach: condition the guards to respond to innocuous events, then make one event not innocuous. It works because it lulls the human part of the system into thinking it’s an ordinary event every time.

David September 25, 2008 1:48 PM

Once again, a solution looking for a problem. Exactly what unique behaviors have been identified in people who, in the immediate future, committed acts of harm? (Hint: I bet the 9/11 terrorists were walking at everyone else’s speed, and probably didn’t leave any bags lying around.)

@Noble_Serf: I believe it was Albert Pujols who summed this up nicely. Paraphrasing: “yeah, I might strike out 3 or 4 times in a game, but I only have to get you once.”

Steven Hoober September 25, 2008 1:51 PM

Has anyone here actually worked in security? I’ve been an art gallery guard (back in art school) and it’s very enlightening how it works.

Most people do nothing worth noting, except that they are there. So you scan a lot. Anyone moving towards a prohibited area (e.g., too close to the art) gets an “alert,” meaning you as the guard wander over there. The patron doesn’t even know this. Only when they get several steps closer do you actually say anything.

Dozens of times per shift you move around in preparation for action. Maybe once every other shift I actually had to say anything to anyone, though. An automated system would just help notify of these alerts, these “something to watch out for” events, but it’s still up to the operator to do something.

Automation seems great here, to allow a single guard to watch a lot more area, or watch it more effectively, as something automated is catching all sorts of potentially interesting events.

Yes, we’re far from total automation, though.

Skorj September 25, 2008 2:27 PM

While this will likely do nothing to deter or detect terrorists, I suspect it will be quite helpful in detecting ordinary crime. Allowing guards to focus their attention on “interesting” situations is a Good Thing, as it’s nearly impossible to pay attention to uninteresting situations for long. This is not a substitute for human judgement, but a force-multiplier for it.

However, I doubt the site has enough ordinary crime to come anywhere close to justifying the expense of the system.

Bernie September 25, 2008 2:34 PM

xxx said, “this is a case where false positives are acceptable [if used correctly] — the goal is to alert a human to oddities (i.e. possible threats), not to detect threats.”

What is odd about someone running to the bathroom? I bet that, if you looked hard enough, you would find something “odd” about every person in every situation.

Anonymous September 25, 2008 3:48 PM

Might be interesting if this was used constructively. People acting unusually may have unusual needs. Perhaps we can help?

TheDoctor September 26, 2008 1:52 AM

First:
There is no terrorist threat (at least not in western countries)
Second:
Nice rip-off of tax money

Bill September 26, 2008 4:33 AM

What’s normal? Spend half an hour at Heathrow airport (say) and it’s a very broad distribution by any measure.

Now take this system, rewind time, and apply it to the airports on 9/11. Would it have caught the terrorists? Doubtful. Who would it have caught? The man in the sombrero? The woman with the irate toddler?

In my country we have pithy phrases for ‘solutions’ like these, “complete bollocks” being the most apt.

Sparky September 26, 2008 4:34 AM

Even if it’s not meant to, it will become a substitute for human judgment, and because it’s a computer system, which is basically deterministic in its actions, it can be gamed. Of course, human judgment can also be gamed.

Terrorists could practice acting “normally”, by having people watch them, and punishing them for detected anomalies.

I’d think the 9/11 hijackers, for example, may have practiced various situations in role-playing games.

Archangel September 26, 2008 8:24 AM

Abnormal !== hinky. Hinky is a subset of abnormal behaviors. Consider the bell curve here. Say you alert on everything outside of a 2-sigma deviation from the mean. This will catch all non-average behaviors, both sides, and reduce the number of observation events dramatically (say by 2/3). Useful. It’s a force multiplier, as has been said. Hell, your computers do it for logging purposes. Ever try to read a system log? How’d you like to do it with every single “event” your computer could register, dumped in along with the important stuff?

However, as has been suggested by people who engage in some of the abnormal behaviors cited, non-average behavior is frequently harmless. I’m another one who walks faster than most people, in my case because I’ve been trained from school age (e.g. crowded passing periods) to see crowds as obstacles to achieving my objective in a timely fashion. I love average behavior because it’s predictable, and I can navigate gaps in it. Am I up to no good? You be the judge:

Objectives on any given day, in public:
– move expeditiously from one transportation mode to another
– make scheduled pick-up times
– acquire necessary materiel (i.e. coffee, donut, books)

And I go about it looking like a weirdo, because I walk around people, quickly, and cut through places not everyone knows about to do it. Am I someone to be stopped? I bloody well hope not — I have to get to class on time, and then home so my wife can pick me up! Security intervention would be far worse than the damned suburbanites.
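To make the 2-sigma thresholding in the comment above concrete, here is a tiny, hypothetical sketch: measure one feature per track (walking speed, say) and flag only observations more than two standard deviations from the mean for human review. The numbers are made up, and, as the comment itself notes, real behaviour isn’t normally distributed and non-average isn’t the same as hostile.

```python
# Hypothetical illustration of "alert on everything outside 2 sigma":
# flag only the statistical outliers of a measured feature for human review.
from statistics import mean, stdev

def two_sigma_outliers(values, k=2.0):
    """Return (index, value) pairs more than k standard deviations from the mean."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [(i, v) for i, v in enumerate(values) if abs(v - mu) > k * sigma]

# Example: one person walking much faster than everyone else.
walking_speeds = [1.2, 1.3, 1.1, 1.4, 1.2, 1.3, 3.8]   # metres per second
print(two_sigma_outliers(walking_speeds))               # flags only the 3.8 m/s walker
```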

JohnJ September 26, 2008 10:19 AM

@ Nomen Publicus: “Freedom Tower (horrible name)”

How about Privacy Tower? Seems appropriate with the surveillance and our government’s penchant for double-speak.

Doktor Jon September 26, 2008 6:02 PM

It’s interesting that whilst there is a huge amount of money and effort being expended on developing workable Video Analytic systems, in practice there is very little informed debate and discussion.

Apart from a significant number of universities and centres of excellence that are working on cutting edge programmes, the number of purely commercial operations that are falling over themselves to get into VA is actually quite amazing.

The problem is that whilst the technology can be deployed to enhance the abilities of a well-trained and motivated human operator, the fact is that the technology is nowhere near intelligent enough to be regarded as dependable in most mission-critical situations.

About eighteen months ago, I actually spent about an hour with the senior development engineer for one of the better-known VA players, and he delighted in explaining what the system was capable of achieving.

Whilst it is true that the equipment’s capabilities can provide a significant advantage in certain operational respects, in practice there are a number of issues which are not really being adequately considered.

At the most basic level, where VA is being used, perhaps at the edge of the network, to reduce the amount of data being relayed and recorded, that can be a major issue within the criminal justice system if all the bits in between are not available for court use.

Likewise, one area where VA has enormous potential but little if any development interest from the manufacturers is in applying the systems to analysing video recordings. In situations where a very serious crime or terrorist incident is being investigated, VA can drastically reduce the time spent analysing the recordings, if used correctly.

Like many areas of video surveillance, the technology may indeed exist and its development may be accelerated for many well-intentioned reasons, but unless its application is considered in context, there are indeed significant risks in relying on high-tech systems whose innate functionality could easily prove their own undoing.

Eric September 29, 2008 12:51 PM

Maybe I am naive, but what was the justification for this $20 million security camera mumbo-jumbo? Was it 9/11? And if it was, would this fancy camera system be able to tell them what the guy flying the plane at the building is thinking?

Sure, I can see this being good for physical security against human bodies on the ground; I just hope 9/11 wasn’t used as the justification. Maybe they should mount those anti-air mini-guns that aircraft carriers have on the building.
