Facial recognition software sold by Amazon mistakenly identified 28 members of Congress as people who had been arrested for crimes, the American Civil Liberties Union announced on Thursday.

Amazon Rekognition has been marketed as a tool that provides extremely accurate facial analysis through photos and video. The ACLU tested that assertion by using the software to scan photos of every current member of the House and Senate against a database that the watchdog built from thousands of publicly available arrest photos.

"The members of Congress who were falsely matched with the mugshot database we used in the test include Republicans and Democrats, men and women, and legislators of all ages, from all across the country," the ACLU stated.

The test misidentified people of color at a disproportionately high rate, 39 percent, even though they made up only 20 percent of Congress. One member falsely cited as a crime suspect was Rep. John Lewis, D-Ga., who first came to prominence as a civil rights leader.

As part of the test, the ACLU said it used Amazon's default match settings.

But a spokeswoman for Amazon Web Services said in an emailed statement that the ACLU should have changed those settings and used a higher "threshold," the percentage that measures how confident Rekognition is that it has found a match.

"While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," she said. For law enforcement, Amazon "guides customers" to set the threshold at 95 percent or higher.

ACLU of Northern California attorney Jacob Snow responded to that comment in an emailed statement: "We know from our test that Amazon makes no effort to ask users what they are using Rekognition for," he said.

Snow doesn't think that changing the threshold changes the danger: "Face surveillance technology in the hands of government is primed for abuse and raises grave civil rights concerns."

Outcry from privacy and civil rights groups has not stopped law enforcement from pursuing the technology. The Orlando, Fla., police force has tested Rekognition's real-time surveillance capabilities. The Washington County Sheriff's Office, near Portland, Ore., has used it to run face searches on photos of suspects taken by deputies.

"This is partly a result of vendors pushing facial recognition technology because it becomes another avenue of revenue," Jeramie Scott, national security counsel at the Electronic Privacy Information Center in Washington, D.C., told NPR. He compared facial recognition software to body cameras worn by law enforcement, which can be used for police accountability or, increasingly, for public surveillance.

He stressed the need for debate so that the technology doesn't become a poor solution for bad policy. "Because of the disproportionate error rate, and because of the real risk of depriving civil liberties posed by facial recognition technology, we need to have a conversation about how and when and under what circumstances this technology should be used by law enforcement, if at all."

Copyright 2018 NPR. To see more, visit http://www.npr.org/.