Exactly how did they confirm it was Bin Laden’s body?
Officials compared the DNA of the person killed at the Abbottabad compound with the bin Laden “family DNA” to determine that the 9/11 mastermind had in fact been killed, a senior administration official said.
It was not clear how many different family members’ samples were compared or whose DNA was used.
A visual identification was also made, using photo comparisons and other facial-recognition techniques, the official said. A second official said that, in addition to DNA, there was a full biometric analysis of facial and body features.
EDITED TO ADD (5/5): A better article.
Posted on May 5, 2011 at 12:52 PM
Remember the Mahmoud al-Mabhouh assassination last January? The police identified 30 suspects, but haven’t been able to find any of them.
Police spent about 10,000 hours poring over footage from some 1,500 security cameras around Dubai. Using face-recognition software, electronic-payment records, receipts and interviews with taxi drivers and hotel staff, they put together a list of suspects and publicized it.
Seems ubiquitous electronic surveillance is no match for a sufficiently advanced adversary.
Posted on October 12, 2010 at 6:12 AM
An NYU student has been reverse-engineering facial recognition algorithms to devise makeup patterns to confuse face recognition software.
Posted on April 12, 2010 at 6:08 AM
In other biometric news, four states have banned smiling in driver’s license photographs.
The serious poses are urged by DMVs that have installed high-tech software that compares a new license photo with others that have already been shot. When a new photo seems to match an existing one, the software sends alarms that someone may be trying to assume another driver’s identity.
But there’s a wrinkle in the technology: a person’s grin. Face-recognition software can fail to match two photos of the same person if facial expressions differ in each photo, says Carnegie Mellon University robotics professor Takeo Kanade.
Posted on May 29, 2009 at 11:19 AM
Ignore the laughable “100% accurate” claim; this is an interesting idea:
Mike Burton, Professor of Psychology at Glasgow, and lecturer Rob Jenkins say they achieved their hugely-improved results by eliminating the variable effects of age, hairstyle, expression, lighting, different camera equipment etc. This was done by producing a composite “average face” for a person, synthesised from twenty different pictures across a range of ages, lighting and so on.
Not useful when you only have one grainy photograph of your target, but interesting research nonetheless.
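The averaging idea is simple enough to sketch in a few lines. This is a toy illustration of pixel-wise averaging over aligned images—synthetic arrays stand in for the twenty photographs, and it is not Burton and Jenkins’s actual pipeline:

```python
import numpy as np

def average_face(images):
    """Average a stack of aligned, same-size face images pixel-wise.

    `images` is a list of HxW grayscale float arrays assumed to be
    pre-aligned (eyes and mouth registered). Averaging cancels
    zero-mean variation from lighting, expression, and camera,
    leaving the stable identity features.
    """
    stack = np.stack([img.astype(np.float64) for img in images])
    return stack.mean(axis=0)

# Stand-in for twenty photos of one person: a fixed "identity"
# pattern plus per-photo noise (lighting, expression, etc.).
rng = np.random.default_rng(0)
identity = rng.uniform(0, 1, size=(64, 64))
photos = [identity + rng.normal(0, 0.3, size=(64, 64)) for _ in range(20)]

composite = average_face(photos)

# The composite sits much closer to the underlying identity than
# any single noisy photo does.
err_single = np.abs(photos[0] - identity).mean()
err_composite = np.abs(composite - identity).mean()
```

Because the per-photo variation is roughly independent across pictures, averaging n photos shrinks it by about a factor of sqrt(n)—which is why the composite matches so much better than any individual shot.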
Posted on February 11, 2008 at 7:18 AM
For a few months, German police tested a face recognition system. Two hundred frequent travelers volunteered to have their faces recorded, and three different systems tried to recognize those faces in the crowds at a train station. Results (in German): 60% recognition at best, 30% on average (depending on light and other factors).
Posted on August 2, 2007 at 1:47 PM
BioBouncer is a face recognition system intended for bars:
Its camera snaps customers entering clubs and bars, and facial recognition software compares them with stored images of previously identified troublemakers. The technology alerts club security to image matches, while innocent images are automatically flushed at the end of each night, Dussich said. Various clubs can share databases through a virtual private network, so belligerent drunks might find themselves unwelcome in all their neighborhood bars.
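Stripped of the marketing, the described system is watchlist matching with a nightly purge. A minimal sketch under assumed details—the embedding representation, cosine-similarity comparison, and threshold are all my inventions, not BioBouncer’s actual design:

```python
import numpy as np

THRESHOLD = 0.9  # assumed similarity cutoff for raising an alert

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_entrant(embedding, watchlist):
    """Alert security if the entrant matches any stored troublemaker."""
    return any(cosine(embedding, w) >= THRESHOLD for w in watchlist)

# Toy data: one watchlisted face, two entrants tonight.
rng = np.random.default_rng(1)
troublemaker = rng.normal(size=128)
watchlist = [troublemaker]

returning = troublemaker + rng.normal(scale=0.05, size=128)  # same person, new photo
stranger = rng.normal(size=128)                              # unrelated patron

tonight = [("returning", returning), ("stranger", stranger)]
alerts = [name for name, emb in tonight if check_entrant(emb, watchlist)]

# Per the stated policy, non-alerting images are deleted at closing;
# whether that deletion actually happens is exactly the question below.
tonight = [(n, e) for n, e in tonight if n in alerts]
```

Note that the purge in the last line is purely a policy choice by whoever runs the system—nothing in the matching architecture requires it, which is the crux of the concern that follows.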
Anyone want to guess how long that “automatically flushed at the end of each night” will last? This data has enormous value. Insurance companies will want to know if someone was in a bar before a car accident. Employers will want to know if their employees were drinking before work—think airplane pilots. Private investigators will want to know who walked into a bar with whom. The police will want to know all sorts of things. Lots of people will want this data—and they’ll all be willing to pay for it.
And the data will be owned by the bars that collect it. They can choose to erase it, or they can choose to sell it to data aggregators like Acxiom.
It’s rarely the initial application that’s the problem. It’s the follow-on applications. It’s the function creep. Before you know it, everyone will know that they are identified the moment they walk into a commercial building. We will all lose privacy, and liberty, and freedom as a result.
Posted on February 28, 2006 at 3:47 PM
Salon has an interesting article about parents turning to technology to monitor their children, instead of to other people in their community.
“What is happening is that parents now assume the worst possible outcome, rather than seeing other adults as their allies,” says Frank Furedi, a professor of sociology at England’s University of Kent and the author of “Paranoid Parenting.” “You never hear stories about asking neighbors to care for kids or coming together as community. Instead we become insular, privatized communities, and look for technological solutions to what are really social problems.” Indeed, while our parents’ generation was taught to “honor thy neighbor,” the mantra for today’s kids is “stranger danger,” and the message is clear—expect the worst of anyone unfamiliar—anywhere, and at any time.
This is security based on fear, not reason. And I think people who act this way make their families less safe.
EDITED TO ADD: Here’s a link to the book Paranoid Parenting.
Posted on August 3, 2005 at 8:38 AM