this post was submitted on 12 Mar 2024
51 points (100.0% liked)

Technology

Brin’s “We definitely messed up”, said at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

[–] Daxtron2@startrek.website 31 points 8 months ago (1 children)

The issue is not that it can generate the images, it's that the pre-prompt filtering for Gemini was coercing the images to include forced diversity in the generations. So asking for a 1940s German soldier would give you multiracial Nazis, even though that obviously doesn't make sense and it's explicitly not what was asked for.
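For anyone curious what that kind of pre-prompt rewriting looks like, here's a minimal sketch in Python. Everything in it (the keyword list, the hint strings, the rewrite_prompt function) is hypothetical; Google hasn't published Gemini's actual rewriting logic.

```python
# Hypothetical sketch of a "pre-prompt" rewrite step, not Google's actual code.
import random

# Hint strings an overzealous rewriter might append (made up for illustration).
DIVERSITY_HINTS = [
    "depicted as people of diverse ethnicities",
    "depicted as people of various genders",
]

# Crude trigger list: any prompt mentioning people gets rewritten.
PEOPLE_WORDS = ("person", "man", "woman", "soldier", "doctor", "pope")

def rewrite_prompt(user_prompt: str) -> str:
    """Blindly append a diversity hint to any prompt that mentions people."""
    if any(word in user_prompt.lower() for word in PEOPLE_WORDS):
        return f"{user_prompt}, {random.choice(DIVERSITY_HINTS)}"
    return user_prompt

print(rewrite_prompt("a 1940s German soldier"))
# -> e.g. "a 1940s German soldier, depicted as people of diverse ethnicities"
# The rewrite ignores the historical context of the request, which is how you
# end up with multiracial Nazis.
```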

[–] NoLifeKing@ani.social 9 points 8 months ago (1 children)
[–] Daxtron2@startrek.website 8 points 8 months ago (1 children)

It is a pretty silly scenario lol. I personally don't really care, but I can understand why they implemented the safeguard, and also why it's overly aggressive and needs to be tuned more.

[–] NoLifeKing@ani.social 1 points 8 months ago (2 children)

Why even put a safeguard in place? Nobody needs it anyway.

[–] Kichae@lemmy.ca 15 points 8 months ago (1 children)

If you create an image generator that always returns clean-cut white men whenever you ask it to produce a "doctor" or a "businessman", but only ever spits out Black women when you ask for a picture of someone cleaning, your PR department is going to have a bad time.

[–] t3rmit3@beehaw.org 10 points 8 months ago

And even worse, it actually reinforces that image in users' minds.

[–] entropicdrift@lemmy.sdf.org 4 points 8 months ago (1 children)

Corporations making AI tools available to the general public are under a ton of scrutiny right now and are kinda in a "damned if you do, damned if you don't" situation. At the other extreme, if they completely uncensored it, the big controversial story would be that pedophiles are generating images of child porn or some other equally heinous shit.

These are the inevitable growing pains of a new industry with a ton of hype and PR behind it.

[–] maynarkh@feddit.nl 7 points 8 months ago

TBH it's just a byproduct of the "everything is a service, nothing is a product" age of the industry. Google ends up responsible for what random people do with its products.