Photo via Nate Steiner/Flickr | Remix by Jason Reed

The FBI is secretly storing your biometric data—and there’s nothing you can do about it

The FBI's explanation is baffling.

 

Lauren Walker


Posted on May 13, 2016   Updated on May 26, 2021, 7:08 pm CDT

The FBI doesn’t want Americans knowing if their biometric data is stored in its Next Generation Identification system, replete with finger and palm prints, iris and facial scans. And despite concerns from privacy and civil liberties groups that the Bureau is collecting this information through unsavory means, it can keep these records secret.

On May 5, the Justice Department submitted a Notice of Proposed Rulemaking (NPRM) to the Federal Register, publicly exempting the Federal Bureau of Investigation’s NGI system from several provisions of the Privacy Act, which requires federal agencies to share a subject’s files so that the information can be verified or corrected. According to the submission, the FBI seeks the exemption because disclosing that the information exists could interfere with the Bureau’s ability to “detect, deter, and prosecute crimes.”

Why, then, nearly two years after the current version of NGI was implemented, has the Justice Department only now made this submission to the Federal Register?

“It’s possible that someone woke up and realized that they needed to do this in order to bring their actions that they were already carrying out into conformity with the law,” says Jay Stanley, a senior policy analyst at the American Civil Liberties Union (ACLU).


But almost a year ago, William McKinsey, the FBI section chief in charge of Next Generation Identification, told me that “we absolutely comply with the Privacy Act, down to the letter.”

So, what gives?

According to Christopher Allen, an FBI spokesman, the Bureau’s ‘proposal’ was merely a formality, as the Privacy Act already contains exemptions for intelligence and law enforcement agencies’ records systems. But if an agency intends to apply a rule, Allen explains, it is required to state so on the record. “The exemptions taken for NGI are largely the same as those taken for … most FBI systems of record,” Allen adds, though he declined to comment on the timing of the latest submission.

Trying to read the Privacy Act, particularly its exemptions, is akin to trying to decipher the Rosetta Stone. The circular contradictions and double negatives are enough to make one dizzy. But regardless of whether the FBI is correct and it is exempt from the Privacy Act’s provisions, its interpretation of how it is bound by the federal law is likely to go into effect. The reason: The proposal is open to public comment for about 30 days following its submission, after which the DOJ will issue its final rule on its own proposal. Or, as Jay Stanley of the ACLU put it, “and then it does whatever it wants.”

To some, the idea of law enforcement having their biometric data in an impenetrable database may cause little concern. “I have nothing to hide” is the oft-heard argument. But inaccurate federal records have the power to negatively affect lives, says Stanley, such as when the FBI’s National Crime Information Center falsely identified one Maryland woman as “unsuitable” for a low-level security clearance, costing her her job. “It is a remarkable collection of data on Americans, and that’s potentially a lot of power to concentrate into one place,” Stanley says. “There are always questions about accuracy, which are now even sharper.”

This saga began nearly a decade ago when the FBI struck a $1 billion contract with Lockheed Martin, the mammoth security and aerospace corporation. Lockheed won the bid to overhaul the Bureau’s seemingly antiquated Integrated Automated Fingerprint Identification System (IAFIS) and overlay it with the biometrics carnival known as NGI. The upgrades were done in increments; Lockheed replaced the hardware in 2010 and revamped the fingerprint algorithm the following year. But in 2014, the FBI announced that the final component was complete: its facial recognition system.  

“One of our biggest concerns about NGI has been the fact that it will include non-criminal as well as criminal face images,” Jennifer Lynch, a senior staff attorney with the Electronic Frontier Foundation (EFF), wrote in 2014, “and you could be implicated as a criminal suspect.”

When presented with Lynch’s concern nearly a year ago, Stephen Morris, Assistant Director of the FBI’s Criminal Justice Information Services division (CJIS), insisted that “the face pictures we maintain in NGI are mugshot photographs … they come from time of arrest.”

But in another document (the System of Records Notice) published concurrently with the ‘proposal’ on May 5, the Justice Department writes that the NGI system covers “Individuals who have provided biometrics (e.g., palm prints, facial images)” for purposes including “employment, licensing, military service, or volunteer service” and “immigration benefits, alien registration and naturalization, or other governmental benefits.”

Having criminal and noncriminal facial images floating around in the same system is troubling, according to the EFF, especially when one considers the quality of the images being fed into NGI and the limited accuracy the system promises.

NGI is not designed to return a single positive match when biometric data is entered to identify its human source. As Ibers of Lockheed told me last year, when NGI was run side by side with the legacy system for five days, “NGI identified 910 additional matches that the [old] system did not identify.” According to documents obtained by the EFF, the FBI only ensures that “the candidate will be returned in the top 50 candidates” 85 percent of the time “when the true candidate exists in the gallery.”

As many outlets have noted, this accuracy rate pales in comparison to Facebook’s DeepFace system, which boasts a 97 percent positive match score. “The nation’s most powerful law enforcement agency is getting outgunned by a social network,” the Verge wrote. Last summer, Morris of the FBI offered the following response:

“Our ability to use facial recognition technology is strictly monitored and regulated, but … when you talk about commercial industry folks, they are governed by one thing, and that’s their bottom line.”

“Just because it works,” he added, “doesn’t mean we can use it.”   

And just because low resolution images “work” in NGI, EFF suggests, doesn’t mean they should be used.

In setting up NGI, the FBI partnered with many state DMVs, using driver’s license headshots to put the new facial recognition system to the test. (Yes, there’s a law enforcement exemption in the Driver’s Privacy Protection Act, too.) When Oregon’s facial images were assessed for quality, “examiners reviewed 14,408 [images] … and found significant problems with image resolution, lighting, background and interference,” EFF writes. “Examiners also found that the median resolution of images was ‘well-below’ the recommended resolution of .75 megapixels (in comparison, newer iPhone cameras are capable of 8 megapixel resolution).”


The imprecision could lead to false charges, the rate of which may only increase as the database grows. Morris of the FBI said last summer that “we have about 24 million images in our interstate photo file in NGI.” This year, according to Stephen Fischer Jr., chief of multimedia productions for the FBI’s CJIS, the number has grown to 26 million.

“With great power comes great need for checks and balances,” says Stanley of the ACLU. “And what we are seeing here, instead of great checks and balances being built commensurate with the power of this database, we’re seeing the FBI seeking to wriggle out of such checks and balances.”

But the FBI says it is doing no such thing. Last year, for instance, Morris of the FBI said that NGI was built with scalability in mind, meaning a new, measurable biometric could be added at any time. “Let’s just say three, four, five years down the road there is a new biometric modality,” he said, “whether that is DNA, iris [scans] or voice or anything like that.” Before adding it, the FBI would have to go through the same privacy considerations it went through before it could use facial recognition in NGI.

And in regard to the FBI’s latest proposal, Allen of the FBI adds that the Bureau may choose to waive its exemptions in order to disclose information—that is, as long as doing so would “not compromise law enforcement or national security efforts.”
