How are they being misclassified? Sounds like the tech is working exactly as intended, by determining subjects’ biological gender features. And it’s not like you even need AI to do this; most of the time it’s also pretty obvious by looking at someone (though I guess it’s now considered insensitive to point it out) 🤦‍♂️
Yes, AI systems aren’t 100% accurate. I’d say 98% is pretty good though, and true failure cases can be used for training purposes to further increase accuracy. Unfortunately though, a square jaw and an Adam’s apple remain what they are when a photo is taken. Not much you can do about that unless we start putting special RFID chips on people with inverted genders.
Okay, I understand where it can get messy. But just like #loveislove, trans and non-binary people don’t usually look different facially, right? If you could tell a gay or trans person by their face/dressing, that would be stereotyping, right? How can an algorithm get something right which isn’t supposed to be predictable in the first place? Can I tell by looking at a male whether he is cis or identifies as a woman (unless she is already out and changing the way she dresses, etc.)? Either way, the face remains the same??