
Facial Recognition Software Adds “Pose Correction” – Reebok’s Face Detection System Profiles How You Shop


The End Of Anonymity

By Erik Sofge | Article courtesy of Popular Science


Technology that matches faces to names can already single out criminals.

What happens when it can identify anyone?

Detective Jim McClelland clicks a button and the grainy close-up of a suspect—bearded, expressionless, and looking away from the camera—disappears from his computer monitor. In place of the two-dimensional video still, a disembodied virtual head materializes, rendered clearly in three dimensions. McClelland rotates and tilts the head until the suspect is facing forward and his eyes stare straight out from the screen.

It’s the face of a thief, a man who had been casually walking the aisles of a convenience store in suburban Philadelphia and shopping with a stolen credit card. Police tracked the illicit sale and pulled the image from the store’s surveillance camera. The first time McClelland ran it through facial-recognition software, the results were useless. Algorithms running on distant servers produced hundreds of candidates, drawn from the state’s catalog of known criminals. But none resembled the suspect’s profile closely enough to warrant further investigation.

It wasn’t altogether surprising.

Since 2007, when McClelland and the Cheltenham Township Police Department first gained access to Pennsylvania’s face-matching system, facial-recognition software has routinely failed to produce actionable results. While mug shots face forward, subjects photographed “in the wild,” whether on the street or from a ceiling-mounted surveillance camera, rarely look directly into the lens. The detective had grown accustomed to dead ends.

But starting in 2012, the state overhauled the system and added pose-correction software, which gave McClelland and other trained officers the ability to turn a subject’s head to face the camera. While I watch over the detective’s shoulder, he finishes adjusting the thief’s face and resubmits the image. Rows of thumbnail mug shots fill the screen. McClelland points out the rank-one candidate—the image mathematically considered most similar to the one submitted.

It’s a match. The detective knows this for a fact because the suspect in question was arrested and convicted of credit card fraud last year. McClelland chose this demonstration to show me the power of new facial-recognition software, along with its potential: Armed with only a crappy screen grab, his suburban police department can now pluck a perpetrator from a combined database of 3.5 million faces.

This summer, the reach of facial-recognition software will grow further still. As part of its Next-Generation Identification (NGI) program, the FBI will roll out nationwide access to more than 16 million mug shots, and local and state police departments will contribute millions more. It’s the largest, most comprehensive database of its kind, and it will turn a relatively exclusive investigative tool into a broad capacity for law enforcement. Officers with no in-house face-matching software—the overwhelming majority—will be able to submit an image to the FBI’s servers in Clarksburg, West Virginia, where algorithms will return a ranked list of between 2 and 50 candidates.

The $1.2-billion NGI program already collects more than faces. Its repositories include fingerprints and palm prints; other biometric markers such as iris scans and vocal patterns may also be incorporated. But faces are different from most markers; they can be collected without consent or specialized equipment—any camera-phone will do the trick. And that makes them particularly ripe for abuse. If there’s any lesson to be drawn from the National Security Agency’s (NSA) PRISM scandal, in which the agency monitored millions of e-mail accounts for years, it’s that the line between protecting citizens and violating their privacy is easily blurred.

So as the FBI prepares to expand NGI across the United States, the rational response is a question: Can facial recognition create a safer, more secure world, with fewer cold cases, fewer missing children, and more criminals behind bars? And can it do so without ending anonymity for all of us?

***
The FBI’s Identification Division has been collecting data on criminals since it formed in 1924, starting with the earliest-used biometric markers—fingerprints. Gathered piecemeal at first on endless stacks of ink-stained index cards, the collection now comprises some 135 million digitized prints. Early forensic experts had to work by eye, matching the unique whorls and arcs of prints lifted from crime scenes to those already on file. Once computers began automating fingerprint analysis in the 1980s, the potentially months-long process was reduced to hours. Experts now call most print matching a “lights-out” operation, a job that computer algorithms can grind through while humans head home for the night.


Matching algorithms soon evolved to make DNA testing, facial recognition, and other biometric analysis possible. And as it did with fingerprints, the FBI often led the collection of new biometric markers (establishing the first national DNA database, for example, in 1994). Confidence in DNA analysis, which involves comparing 13 different chromosomal locations, is extremely high—99.99 percent of all matches are correct. Fingerprint analysis for anything short of a perfect print can be less certain. The FBI says there’s an 86 percent chance of correctly matching a latent print—the typically faint or partial impression left at a crime scene—to one in its database, assuming the owner’s print is on file. That does not mean that there’s a 14 percent chance of identifying the wrong person: Both DNA and fingerprint analysis are admissible in court because they produce so few false positives.

Facial recognition, on the other hand, never identifies a subject—at best, it suggests prospects for further investigation. In part, that’s because faces are mutable. Fingerprints don’t grow mustaches, and DNA can’t throw on a pair of sunglasses. But faces can sprout hair and sag with time and circumstance. People can also resemble one another, either because they have similar features or because a low-resolution image tricks the algorithm into thinking they do.

As a result of such limitations, no system, NGI included, serves up a single, confirmed candidate but rather a range of potential matches. Face-matching software nearly always produces some sort of answer, even if it’s completely wrong. Kevin Reid, the NGI program manager, estimates that a high-quality probe—the technical term for a submitted photo—will return a correct rank-one candidate about 80 percent of the time. But that accuracy rating is deceptive. It assumes the kind of image that officers like McClelland seldom have at their disposal.
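The ranking step described above can be sketched in a few lines of code. This is a minimal illustration of the general idea—comparing a probe's feature vector against a gallery of pre-computed templates and returning the highest-scoring candidates—not the actual NGI algorithm, whose internals are not public. The vectors, IDs, and the use of cosine similarity here are all assumptions for illustration; real systems use proprietary feature extractors and much higher-dimensional templates.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_candidates(probe, gallery, max_candidates=50):
    """Return gallery entries ordered by similarity to the probe.

    `gallery` maps a subject ID to a pre-computed face template
    (feature vector). The rank-one candidate is simply the entry
    with the highest score -- a lead for investigators, never a
    confirmed identification.
    """
    scored = [(cosine_similarity(probe, vec), subject_id)
              for subject_id, vec in gallery.items()]
    scored.sort(reverse=True)
    return [(sid, score) for score, sid in scored[:max_candidates]]

# Toy 3-D "templates"; real templates have hundreds of dimensions.
gallery = {
    "mugshot_001": [0.90, 0.10, 0.30],
    "mugshot_002": [0.20, 0.80, 0.50],
    "mugshot_003": [0.88, 0.15, 0.28],
}
probe = [0.87, 0.12, 0.29]
print(rank_candidates(probe, gallery, max_candidates=2))
```

Note that the function always returns *something*: even a probe that resembles no one in the gallery produces a ranked list, which is exactly why, as the article notes, face-matching software "nearly always produces some sort of answer, even if it's completely wrong."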

Image via: Enquirer – Facial Recognition Database

Candid Camera: Facial Analysis In The Wild

When a shopper enters Reebok’s flagship store in New York City, a face-detection system analyzes 10 to 20 frames per second to build a profile of the potential customer.
The algorithms can determine a shopper’s gender and age range as well as behavioral and emotional cues, such as interest in a given display (it tracks glances and the amount of time spent standing in one place). Reebok installed the system, called Cara, in May 2013; other companies are following suit. Tesco recently unveiled a technology in the U.K. that triggers digital ads at gas stations tailored to the viewer’s age and gender. Face detection shouldn’t be confused with facial recognition. Cara extracts data from up to 25 faces at once, but it doesn’t record or match them against a database. “The images are destroyed within a fraction of a second,” says Jason Sosa, the CEO of New York–based IMRSV, which developed the software.
Most businesses aren’t interested in collecting your face, just the demographic info etched into it.
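The distinction the sidebar draws—extracting demographic attributes per frame while discarding the images themselves—can be made concrete with a short sketch. Cara's internals aren't public, so everything below (the attribute names, the dwell-time threshold for "interest") is a hypothetical model of how such a system might aggregate data, not IMRSV's actual implementation.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class FaceObservation:
    """Anonymous attributes a detector might emit for one face in one frame.
    The pixel data itself is analyzed and discarded; only these fields remain."""
    gender: str           # e.g. "female"
    age_range: str        # e.g. "25-34"
    dwell_seconds: float  # time spent facing the display

def aggregate_profiles(observations, min_dwell=2.0):
    """Build store-level demographics from per-frame observations.

    Only counts survive -- no face image or identity is retained,
    which is what separates face *detection* from face *recognition*.
    """
    demographics = Counter()
    engaged = 0
    for obs in observations:
        demographics[(obs.gender, obs.age_range)] += 1
        if obs.dwell_seconds >= min_dwell:
            engaged += 1  # lingering is treated as interest in the display
    return demographics, engaged

frames = [
    FaceObservation("female", "25-34", 3.2),
    FaceObservation("female", "25-34", 0.4),
    FaceObservation("male", "45-54", 5.1),
]
demographics, engaged = aggregate_profiles(frames)
print(demographics)  # counts per (gender, age_range) bucket
print(engaged)       # shoppers who lingered at the display
```

The design choice worth noticing: because only `Counter` tallies leave the function, nothing here could be matched against a mug-shot database—the "destroyed within a fraction of a second" claim depends entirely on the raw frames never being persisted.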
