Amazon's Deep Learning Service Rekognition Mistakes Congress Members For Criminals
Perhaps it needs more time in the oven.
Amazon's Rekognition service has been making a considerable number of mistakes, it seems, and its latest batch may be the most egregious yet. The American Civil Liberties Union recently reported that, in a test it ran, the service mistakenly identified 28 members of Congress as criminals.
For the test, the ACLU assembled a whopping 25,000 mugshots from an unnamed "public source" and had the service compare them against photos of every member of Congress. A number of lawmakers were misidentified, including civil rights activist Rep. John Lewis (D-Georgia). In fact, many of the false matches, around 38 percent of them to be exact, were of people of color.
An Amazon spokesperson told VentureBeat that the test likely produced these egregious mistakes because of "poor calibration," namely the low "confidence threshold" the test used.
“While 80 percent confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty.”
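For a sense of what that threshold actually does: it's a parameter the caller passes to the API, and candidate matches scoring below it are simply dropped from the results. Here's a minimal sketch using the boto3 CompareFaces call (the file names are made up for illustration; credentials and region come from your AWS config):

    import boto3

    # Compare one face image against another with Amazon Rekognition.
    rekognition = boto3.client("rekognition")

    with open("congress_member.jpg", "rb") as src, open("mugshot.jpg", "rb") as tgt:
        response = rekognition.compare_faces(
            SourceImage={"Bytes": src.read()},
            TargetImage={"Bytes": tgt.read()},
            # The ACLU's test reportedly ran at the default of 80;
            # Amazon recommends 99 when identifying individuals.
            SimilarityThreshold=99,
        )

    # Only matches at or above the threshold come back in FaceMatches.
    for match in response["FaceMatches"]:
        print(f"match at {match['Similarity']:.1f}% similarity")

Raising the threshold is a one-line change, which is presumably why Amazon frames the ACLU's result as a calibration problem rather than a model problem.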
Previously, Amazon shareholders banded together to send a letter to Amazon CEO Jeff Bezos voicing concerns over Rekognition being sold to law enforcement agencies.
“We are concerned the technology would be used to unfairly and disproportionately target and surveil people of color, immigrants, and civil society organizations … We are concerned sales may be expanded to foreign governments, including authoritarian regimes," read the letter.
If tech like this is going to be deployed more widely in the future, it sounds like it'll have to go through a particularly extensive metamorphosis first.
-
Funny story. At my work at {very large computer company}, we had a customer complain about an issue. Tech support closed the ticket (essentially) by saying it was working as designed. When I heard the story I was like "fuck that" so I contacted the platform designers and BIOS engineer and was like, "why is it designed that way, it's stupid". They're in the process of changing the behavior. Gonna take a while but I can't let that bullshit ticket closing stand.
-
Here's a spreadsheet of AIs that developed... "unexpected" solutions.
https://docs.google.com/spreadsheets/u/1/d/e/2PACX-1vRPiprOaC3HsCf5Tuum8bRfzYUiKLRqJmbOoC-32JorNdfyTiRRsR7Ea5eWtvsWzuxo8bjOxCG84dAg/pubhtml
-
There was a recent paper where someone made a CNN to recognise and classify skin cancer from photos. It worked brilliantly on their training set, but failed disastrously in real trials, the AI sending people with weeping encrusted lesions away with a "nah ya fine mate quit ya whingin'!"
On investigation they discovered that all the serious carcinomas in their training set had rulers next to them to show scale. They'd built a very advanced ruler detector :(
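That failure mode, a model latching onto a confound that won't exist in deployment, is easy to reproduce on toy data. Here's a quick sketch (made-up numbers, not the paper's actual model) where a classifier aces training by leaning on a "ruler present" feature and then collapses to near chance once the rulers go away:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 1000

    # "Severity" is only weakly predictive; "ruler" is a confound that
    # perfectly tracks the label in training, because serious lesions
    # were photographed next to a ruler for scale.
    severity = rng.normal(size=n)
    labels = (severity + rng.normal(scale=2.0, size=n) > 0).astype(int)
    ruler = labels.copy()

    X_train = np.column_stack([severity, ruler])
    clf = LogisticRegression().fit(X_train, labels)
    print("training accuracy:", clf.score(X_train, labels))    # ~1.0

    # In the clinic there are no rulers, so the shortcut vanishes.
    X_clinic = np.column_stack([severity, np.zeros(n)])
    print("deployment accuracy:", clf.score(X_clinic, labels))  # near chance

The model never learned anything about lesions at all; it learned to detect rulers, exactly as the investigators found.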
-