this post was submitted on 05 Jun 2024
664 points (100.0% liked)
LGBTQ+
you are viewing a single comment's thread
I wonder if the AI is detecting that the photo is taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?
Yeah, this is a valid point. Whether or not that's exactly what's happening here I don't know, but a lot of people don't realize how many weird biases can creep in through the training data.
Like that AI trained to detect if a mole was cancerous or not. A lot of the training images that showed cancer had rulers in them, so the AI learned that rulers are cancerous.
I could easily see something stupid like the angle the picture was taken from being a feature the AI erroneously learned as useful for determining biological sex in this case.
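As a toy illustration of how that kind of spurious correlation gets learned (completely made-up data and a plain logistic regression, not the actual melanoma study): give the model a "ruler present" column that happens to line up with the label during training, and it leans on the ruler instead of the real signal.

```python
# Hypothetical sketch: a spurious "ruler" feature correlates with the cancer
# label in training, so the model learns the ruler instead of the lesion.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000

def make_data(ruler_matches_label):
    label = rng.integers(0, 2, n)
    real_feature = label + rng.normal(0, 1.5, n)     # weak, noisy "real" signal
    if ruler_matches_label:
        ruler = label.astype(float)                  # spurious: ruler iff cancer
    else:
        ruler = rng.integers(0, 2, n).astype(float)  # correlation broken
    return np.column_stack([real_feature, ruler]), label

X_train, y_train = make_data(True)
X_same, y_same = make_data(True)        # test set where rulers still correlate
X_broken, y_broken = make_data(False)   # test set where they don't

model = LogisticRegression().fit(X_train, y_train)
print("weights [real_feature, ruler]:", model.coef_)   # ruler weight dominates
print("accuracy with rulers correlated:", model.score(X_same, y_same))
print("accuracy with correlation broken:", model.score(X_broken, y_broken))
```

The ruler column ends up with most of the weight, and accuracy collapses the moment rulers stop lining up with the label, which is essentially the story with the melanoma rulers and could just as easily happen with camera angle.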
It's possible to manipulate an image so that the original and the new one are indistinguishable to the human eye, but the AI model gives completely different results.
Like this helpful graphic I found
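For anyone curious what that kind of manipulation looks like in practice, here's a rough sketch of the usual "fast gradient sign" trick, assuming a recent PyTorch/torchvision install; the image tensor and class index are placeholders, and this isn't whatever model is in the screenshot:

```python
# Sketch of a fast-gradient-sign (FGSM-style) perturbation: nudge every pixel
# a tiny amount in the direction that increases the model's loss.
import torch
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

image = torch.rand(1, 3, 224, 224)   # placeholder for a real, preprocessed photo
label = torch.tensor([207])          # placeholder class index
epsilon = 0.005                      # per-pixel change far too small to see

image.requires_grad_(True)
loss = torch.nn.functional.cross_entropy(model(image), label)
loss.backward()

# Same picture to a human, different input to the model.
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

print("original prediction: ", model(image).argmax(dim=1).item())
print("perturbed prediction:", model(adversarial).argmax(dim=1).item())
```

The change per pixel is tiny, but because it's aligned with the model's own gradient it can flip the prediction entirely, which is exactly the kind of pair those side-by-side graphics show.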
Or... edit the HTML...
You think someone would do that? Just go on the internet and lie?