Zaya Rose ⛧
"Ms. Woodruff is the sixth person to report being falsely accused of a crime as a result of facial recognition technology used by police to match an unknown offender’s face to a photo in a database. All six people have been Black; Ms. Woodruff is the first woman to report it happening to her"
02:54 PM - Aug 06, 2023
Build the Future
In response to Zaya Rose ⛧.
ML tech is biased because data collectors are biased/prejudiced. Fortunately, biased data is easier to fix than prejudiced police, so there's hope. Police are not competent enough to be trusted with some forms of technology. Companies providing faulty algorithms need to be sued out of existence.
11:40 PM - Aug 06, 2023
Bo Dybkær
In response to Zaya Rose ⛧.
That's disgusting, but sadly not surprising. A friend of mine worked for a leading European company in TV surveillance with face recognition. They were perfectly aware that their systems were awful at recognizing anything other than white faces.
05:10 PM - Aug 06, 2023
Terminator LX
In response to Zaya Rose ⛧.
Lie detector tests aren't considered reliable enough for the law, but facial recognition software, which is still in its infancy, is? Police should NOT be able to arrest someone based on AI alone, especially since AI has been known to have more issues processing dark-skinned faces.
04:29 PM - Aug 06, 2023
William Turner
In response to Terminator LX.
This spout was removed because the account associated with it was suspended.
06:40 PM - Aug 06, 2023
Dee Nelson
In response to Zaya Rose ⛧.
Reminder:

Just because something is done by a machine doesn't mean it's unbiased.

AI is programmed and given training/learning algorithms by people. Usually by white men. This affects how it functions.

Read Timnit Gebru and other AI scientists of color to learn how AI can be terribly racist.
04:21 PM - Aug 06, 2023
Craig Swanson
In response to Zaya Rose ⛧.
Appalling. If technology doesn't work (or rather, has not been proven to work by quadruply redundant independent verification), it cannot be used in mission-critical applications: and that means anything in government, health, social services, policing, etc. In some domains, mistakes cannot be allowed.
04:15 PM - Aug 06, 2023
Happy Cat
In response to Zaya Rose ⛧.
Yikes, what an awful ordeal. How frightening for her.
03:12 PM - Aug 06, 2023
Henning Dekant
In response to Zaya Rose ⛧.
I worked on a machine learning solution to estimate age from profile pictures, and it quickly became apparent that most available pre-trained models for facial features do not perform well on dark skin. White faces seem to be overrepresented in the standard training data sets.
03:04 PM - Aug 06, 2023
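Henning's observation is, in effect, a subgroup-evaluation result: the skew only becomes visible when a model's error rate is broken out by skin-tone group rather than reported as a single aggregate number. A minimal sketch of such an audit, assuming a labeled evaluation set annotated with a skin-tone attribute (every name below is an illustrative placeholder, not any particular library's API):

```python
# Minimal sketch: break a face model's error rate out by skin-tone subgroup.
# `records` and `predict` are hypothetical placeholders for your own
# evaluation set and model; they are not part of any specific library.
from collections import defaultdict

def subgroup_error_rates(records, predict):
    """records: iterable of (image, true_label, skin_tone_group) tuples.
    predict: callable mapping an image to a predicted label.
    Returns {group: error_rate}, making cross-group skew visible."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for image, true_label, group in records:
        totals[group] += 1
        if predict(image) != true_label:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}
```

A large gap between groups in that output is exactly the pattern audits such as Buolamwini and Gebru's Gender Shades study reported for commercial facial-analysis systems.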
Dee Nelson
In response to Henning Dekant.
Yes. And this is why representation in STEM really, really matters.
04:23 PM - Aug 06, 2023