In a world where police masquerade as Google Street View cars to spy on citizens, questioning the validity of data amassed by law enforcement is absolutely necessary to protect against abuse. That’s why the Department of Justice’s proposal to prevent Americans from finding out whether pictures of their fingerprints and tattoos are being stored in the FBI’s Next Generation Identification (NGI) database is unacceptable and dangerous.
Unlike other law enforcement databases that aggregate your — get ready, it’s a long list — name, home and work addresses, credit rating, websites visited, internet searches, emails sent and received, social media activity, mobile phone GPS-location data, phone call and text records, legal documents, health records, cable television shows watched and recorded, commuter toll records — whew, almost done — electronic bus and subway passes, educational records, and arrest history — ok, done — the FBI’s NGI will stockpile biometric identifiers for millions of Americans with or without their consent.
The New York Times ran a story on how police departments indiscriminately use mobile facial recognition devices in the field, including on non-consenting residents who aren’t suspected of a crime. Those photos are then uploaded to NGI.
Coupled with the fact that the FBI shares this database with “18,000 local, state, tribal and federal law enforcement agencies across the country,” it’s not hard to imagine why someone would want to know if pictures of her face are being stored indefinitely alongside suspected car thieves and bank robbers.
The FBI began systematically collecting fingerprints in 1924. Starting in 1999, the bureau went digital with the launch of its Integrated Automated Fingerprint Identification System (IAFIS). While that system was the largest person-centric database ever created, NGI goes miles beyond its predecessor in scope and methodology.
“Aside from criminals, suspects and detainees,” NGI will store biodata from “people fingerprinted for jobs, licenses, military or volunteer service, background checks, security clearances, and naturalization, among other government processes,” reports Nextgov.
Images and information of non-criminals would be stored alongside those of criminals, creating a massive trove of biodata that is ripe for abuse and error. This is unprecedented. As face recognition systems become the primary means by which people are electronically identified, storing personal information from job seekers in a database so closely associated with criminals is especially troubling. In fact, researchers at NYU found that the risk of falsely identifying someone increases as the size of the dataset being examined increases.
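The NYU finding tracks with basic probability. The sketch below is illustrative only, not drawn from NGI's actual error rates: it assumes a hypothetical, fixed per-comparison false-match rate and independent comparisons, and shows how the chance of at least one false match climbs toward certainty as the database grows.

```python
# Illustrative sketch: why false matches grow with database size.
# Assumes a hypothetical per-comparison false-match rate and independent
# comparisons — both are simplifying assumptions, not figures from NGI.

def false_match_probability(p: float, n: int) -> float:
    """Probability of at least one false match across n independent
    comparisons, each with false-match rate p: 1 - (1 - p)^n."""
    return 1 - (1 - p) ** n

FALSE_MATCH_RATE = 0.0001  # hypothetical 0.01% error per comparison

for size in (1_000, 100_000, 10_000_000):
    prob = false_match_probability(FALSE_MATCH_RATE, size)
    print(f"database of {size:>10,} records -> "
          f"{prob:.1%} chance of at least one false match")
```

Even a seemingly tiny per-comparison error rate produces a near-certain false match once a search sweeps millions of records, which is why the size of NGI's non-criminal file matters.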
Civil liberties activists like the Electronic Frontier Foundation (EFF) have been fighting to force the FBI to release more information on how it will use its massive storehouse of face-recognition-ready photos. Check out their FOIA request here.
“Even if you have never been arrested for a crime, if your employer requires you to submit a photo as part of your background check, your face image could be searched — and you could be implicated as a criminal suspect, just by virtue of having that image in the non-criminal file,” said EFF.
This means that many people will be presented as suspects for crimes they didn’t commit. This is not how our justice system is supposed to work.
As Electronic Privacy Information Center’s Jeramie Scott explains, “If you have no ability to access the record the FBI has on you, even when you’re not part of an investigation or under investigation, and lo and behold inaccurate information forms a ‘pattern of activity’ that then subjects you to [be] the focus of the FBI, then that’s a problem.”