DOJ seeks Privacy Act exemptions for FBI’s extensive biometric database
The U.S. Department of Justice has proposed a new rule that would exempt the FBI Next Generation Identification system — purportedly the world’s largest biometric database — from key provisions of the Privacy Act. The information in the database goes far beyond mugshots and fingerprints of convicted criminals. It includes facial recognition imagery, palm prints and biographical information on everyday people who have undergone routine background checks, applied for welfare benefits or registered for immigration status. Not only that — the FBI is free to share this info with state and local agencies, private contractors and even foreign governments.
The rule change request opened a window for public comment, which has since been extended after 45 organizations signed onto a public letter expressing serious concerns.
FSRN’s Shannon Young spoke with Jeramie Scott, a national security attorney with the Electronic Privacy Information Center – EPIC – and director of its Domestic Surveillance Project.
Shannon Young: Can you explain specifically what provisions of the Privacy Act the FBI is seeking to exempt its Next Generation Identification System from and what the implications would be for ordinary Americans?
Jeramie Scott: Sure. So, the FBI wants broad exemptions for NGI to avoid the Privacy Act requirements of accuracy, relevancy and necessity, accounting of disclosures, individual access to records, and civil remedies. What that means is they’re not obligated to necessarily keep accurate records; they can collect whatever information they want, it doesn’t have to be relevant, it doesn’t have to be necessary. And they don’t have to allow individuals access to those records, so the individual can actually see whether the information kept by the FBI is accurate. And they want to exempt themselves from civil remedies. So, if the FBI ends up using inaccurate information which focuses their attention on you and disrupts your life, you no longer would have a civil remedy in that case.
SY: One of the concerns that has been raised about this database in particular is that it stores biometric information on individuals who have not been convicted of a crime. This can include people who have submitted personal information for employment-related background checks or people who have been wrongfully arrested. What current mechanisms exist for non-criminal individuals who would like to have some modicum of control over their biometric data?
JS: Legally speaking, there’s not a lot of law in place to help you control your biometric information. If you’re out in the street and someone takes a picture of a crowd, they can use facial recognition to identify you. You know, a fingerprint left in public, that can be used at will. So there’s not a lot of law there to set limits on the collection, storage and sharing of biometric information. And what the FBI is trying to do now is basically remove any sort of public oversight with respect to this very, very large biometric database.
SY: Specifically in a law enforcement context, police in some jurisdictions take DNA samples from people they arrest at protests. And while the charges that provided the basis for the arrest may eventually be dropped, the biometric data of the DNA sample remains in a database. Is there a way for someone who was wrongfully arrested, for example, to make sure that data is scrubbed?
JS: In the context of an arrest, I’m not sure you can have it removed at all, potentially even in the case where you’re arrested and charges are dropped. Now, keep in mind, that means there’s a whole lot of people who participate in civil disobedience and other protest actions who are arrested, whose information then gets entered into NGI. And that information is then kept for decades, until, I think, the person turns 100, 110, or something; it’s a very long time that they keep this information.
SY: Your organization, EPIC, sued the FBI in 2013 over access to information pertaining to the Next Generation Identification database. What were some of the key points of information that came out as a result of that legal action?
JS: One of the key documents we got was the technical specifications for NGI. And what it showed is that the FBI is willing to accept a 20 percent error rate with their facial recognition searches. And that’s a pretty large error rate. They do have people review the results of the searches, but that means potentially a lot of people are then flagged for the FBI to look at.
SY: And with that 20 percent error rate across the general population, there are also specific populations that run a higher risk for being falsely flagged, correct?
JS: Yes, this is correct. So, the error rate I’m talking about is, within the technical specifications, the FBI said, “We’re willing to accept a 20 percent error rate on the facial recognition searches that are run.” So 20 percent of the time, you know, the result they get back is not actually a match for the facial recognition search. But there has been some work done to look at the success of facial recognition across different ethnicities and with certain minority groups; facial recognition is actually worse in terms of its ability to identify a proper match.
SY: There’s a relatively new trend in law enforcement to use, or at least experiment with, what’s known as algorithm-based policing to try to develop threat models and predict crime before it happens. How does this biometric database fit into this new technology model?
JS: It’s not completely clear how information in NGI will be used for that, but one of the red flags here is that the FBI wants to keep even information that doesn’t seem important to their investigations in the database. And one of the things that is said in the documents for the proposed rule making is that they want to be able to use information to establish patterns of activity. And one way to do that is to use algorithms, potentially, to sort through that information, to establish that pattern. And, obviously, if they’re using algorithms, it’s another area that there needs to be transparency, oversight and accountability, and these Privacy Act exemptions just make it harder for the public to provide oversight and for there to be transparency and accountability.
SY: The public now has until July 6 to submit comments on the proposed rule change. What happens after that?
JS: So after the comments are submitted, the FBI is obligated to review all those comments and actually reply in a substantive manner. That doesn’t mean they have to reply to each single one individually, but the critical issues that are brought up in those comments, they generally need to reply to before the rule becomes final. And if they don’t provide adequate reasoning for their exemptions and for their rule, that can actually be challenged in court.
Jeramie Scott is a national security attorney and Director of the Domestic Surveillance Project at the Electronic Privacy Information Center. He joined FSRN by phone from Washington, D.C.