‘We definitely messed up’: why did Google AI tool make offensive historical images?
(www.theguardian.com)
That really isn't relevant to the discussion here, but to satisfy your curiosity: I'm busy building a Lego model that a family member sent me, so the AI-generated photo was supposed to depict someone who looked vaguely like me building such a Lego model. I've used Bing in the past, and it has usually delivered 4 usable choices. The fact that Google gave me something that was distinctly NOT what I asked for means it is messing with the specifics that are asked for.
Why use an AI? Just like... take a selfie
So, what you're saying is that white people shouldn't use AI?
It would appear that is exactly what I'm saying, as long as the reader lacks any reading comprehension skills.
I'm not the Lego person, but I am not taking that selfie because: 1) I don't want to clean the house to make it look all nice before judgey relatives critique the pic, 2) my phone is old and all its pics are kinda fish-eyed, and 3) I don't actually want to spend the time doing the task right now when AI can get me an image in seconds.