FBI pilot program uses Amazon’s controversial Rekognition face-matching software

The top US police agency is testing out Amazon’s controversial facial recognition software, Rekognition, to sort through petabytes of video data. The move follows widespread protests by investors and employees over the company’s marketing of the unproven software to police departments for pennies on the dollar.

FBI officials lamented last November that it took agents three weeks of around-the-clock work to comb through a petabyte’s worth of surveillance data and videos in the aftermath of the 2017 mass shooting in Las Vegas; had they had access to Amazon’s Rekognition software, the job could have been done “in 24 hours,” Nextgov reported Wednesday.

“Think about that,” FBI Deputy Assistant Director for Counterterrorism Christine Halvorsen said at the Amazon Web Services re:Invent conference, noting the volume of cases the agency is tasked with solving. “The cases don’t stop, the threats keep going. Being able to not pull people off that and have computers do it is very important.”

Well, now the FBI is doing just that: kicking off a pilot program that uses the technology to sort through data while its agents move on to bigger and better things.

Last May, Sputnik reported that the artificial intelligence behind Rekognition, which can identify, track, and analyze people and recognize up to 100 faces in a single image, was being marketed by Amazon to US police departments for as little as $6 a month. That tiny fee gave law enforcement agencies access to Amazon Web Services (AWS).

In turn, Amazon requested that those agencies recommend the brand to their partners, including body camera manufacturers, according to documents obtained by the American Civil Liberties Union (ACLU).

Amazon has also provided US intelligence services with their own private corner of AWS, the so-called “AWS Secret Region,” where 17 intelligence agencies can host, analyze and run applications on government data classified at the secret level under a contract with the CIA.

In July the ACLU put Rekognition to the test, finding that the software couldn’t even correctly identify members of Congress. The program misidentified 28 lawmakers, Sputnik reported, with the ACLU noting that it misidentified black members of Congress twice as often as non-black lawmakers.
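To put that figure in perspective, a quick back-of-the-envelope calculation shows the implied false-match rate. The 28 false matches come from the ACLU test described above; the total of 535 voting members of Congress (House plus Senate) is an outside figure not stated in the article:

```python
# Rough false-match rate from the ACLU's Rekognition test of Congress.
# 28 false matches is reported above; 535 total members of Congress
# (435 House + 100 Senate) is an assumed outside figure.
total_members = 535
false_matches = 28

false_match_rate = false_matches / total_members
print(f"False-match rate: {false_match_rate:.1%}")  # roughly 5.2%
```

In other words, roughly one in twenty sitting lawmakers was wrongly matched — a sketch of scale, not a claim about how the error rate would generalize to other populations.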

The news subsequently prompted dozens of corporate shareholders and senior engineers and hundreds of Amazon employees to lodge formal complaints with CEO Jeff Bezos, urging the company to cease its cooperation with police departments for fear of the technology being misused. In October, a letter signed by 450 employees also urged Amazon to drop the data firm Palantir from AWS because it helped US Immigration and Customs Enforcement (ICE) track and deport illegal immigrants.

However, it emerged later that month that Amazon had actually been courting ICE directly since at least the summer. The Daily Beast reported October 23 on a June pitch of Rekognition by Amazon to ICE in the Silicon Valley offices of consulting firm McKinsey & Company, which had previously partnered with ICE. The revelation came via documents obtained by the Project on Government Oversight through a Freedom of Information Act request.

Further, the Washington Examiner reported last week that the FBI was also pursuing the National Institute of Standards and Technology’s (NIST) tattoo image-matching system, a project it had supported for four years, despite the system having only a 67.9 percent accuracy rate, including false positives. The system, dubbed Tatt-E, or Tattoo Recognition Technology Evaluation, works similarly to Rekognition, drawing on a large database of images to match them via AI.

One of the major problems with these systems, web developer and technologist Chris Garaffa told Sputnik Friday, is that “there is no guarantee for innocent individuals who will inevitably be caught up in these videos.” Indeed, Tatt-E’s algorithm doesn’t even consider the possibility of a false positive.

Further, “Rekognition technology is going to allow automated, very fast review of video, with similarly automated cataloging of the faces at various times and places,” Garaffa told Sputnik, “effectively creating a database of where you were and when.”

Garaffa further noted that breaches of AWS storage systems happen on a regular basis “because of poorly-configured control over who has access,” creating real security concerns around such a sensitive database.





