this post was submitted on 11 Jun 2024
Technology

[–] Sanctus@lemmy.world 0 points 3 months ago

It's all of us who've ever had an online presence, I'd bet. The depth of what has been done will not come to light for a while.

[–] nothingcorporate@lemmy.world 0 points 3 months ago (1 children)

Born without my consent
Used for AI training without my consent

[–] kewwwi@lemmy.world 0 points 3 months ago (1 children)

killed by AI with my consent

[–] barsquid@lemmy.world 0 points 3 months ago (1 children)

No, that one will have my full consent.

[–] otp@sh.itjust.works 0 points 3 months ago (1 children)
[–] barsquid@lemmy.world 0 points 3 months ago (2 children)
[–] ValenThyme@reddthat.com 0 points 3 months ago

I read it wrong, too!

[–] otp@sh.itjust.works 0 points 3 months ago

Haha, it happens to everyone from time to time

[–] autotldr@lemmings.world 0 points 3 months ago

This is the best summary I could come up with:


Photos of Brazilian kids—sometimes spanning their entire childhood—have been used without their consent to power AI tools, including popular image generators like Stable Diffusion, Human Rights Watch (HRW) warned on Monday.

The dataset does not contain the actual photos but includes image-text pairs derived from 5.85 billion images and captions posted online since 2008.

HRW's report warned that the removed links are "likely to be a significant undercount of the total amount of children’s personal data that exists in LAION-5B."

Han told Wired that she fears that the dataset may still be referencing personal photos of kids "from all over the world."

There is less risk that the Brazilian kids' photos are currently powering AI tools since "all publicly available versions of LAION-5B were taken down" in December, Tyler told Ars.

That decision came out of an "abundance of caution" after a Stanford University report "found links in the dataset pointing to illegal content on the public web," Tyler said, including 3,226 suspected instances of child sexual abuse material.


The original article contains 677 words, the summary contains 169 words. Saved 75%. I'm a bot and I'm open source!
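
The summary above notes that LAION-5B does not ship the photos themselves, only image-text pairs: URLs pointing at images on the public web plus their captions. As a rough illustration of what that means in practice, here is a minimal sketch of scanning one metadata shard for links to a given photo host; the parquet layout, the column names ("URL", "TEXT"), and the file name are assumptions and may not match the actual release.

```python
# Minimal sketch (assumed schema): scan a LAION-style metadata shard for
# image URLs pointing at a particular domain. LAION-5B metadata is commonly
# distributed as parquet files; the "URL" and "TEXT" column names are an
# assumption here and may differ between releases.
import pandas as pd

def find_urls_from_domain(parquet_path: str, domain: str) -> pd.DataFrame:
    """Return the image-text pairs whose image URL points at `domain`."""
    df = pd.read_parquet(parquet_path, columns=["URL", "TEXT"])
    mask = df["URL"].str.contains(domain, case=False, na=False, regex=False)
    return df[mask]

# Hypothetical usage:
# hits = find_urls_from_domain("laion5b-meta-part-00000.parquet", "example-photo-host.com.br")
# print(len(hits), "matching image-text pairs")
```

Because the dataset only points at images hosted elsewhere, removing links from it (as HRW requested) does not remove the photos from the web itself.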

[–] overload@sopuli.xyz 0 points 3 months ago (2 children)

Even if you're not on social media, you'll probably still have a shadow profile on Google's/Meta's servers. My 13-month-old baby has a library of images searchable in Google Photos and a profile photo in the app. It's convenient, but incredibly creepy.

[–] scrion@lemmy.world 0 points 3 months ago (2 children)

Yeah, why would you allow this to happen though?

[–] Rai@lemmy.dbzer0.com 0 points 3 months ago (2 children)

I want to defend that poster but I can’t disagree with you… There is one person responsible and it’s definitely not the child….

[–] Tagger@lemmy.world 0 points 3 months ago (1 children)

I'm assuming all he means is that he uses Google Photos to store his pictures, so Google is the one hosting them.

[–] Mkengine@feddit.de 0 points 3 months ago

He said it's creepy but convenient, but digital privacy and laziness generally don't go hand in hand. Every week I read about another alternative to Google Photos, so the solution is not far away (three posts down I found this, for example). To each their own, I guess, but with such simple solutions I can't justify using Google's spyware.

[–] scrion@lemmy.world 0 points 3 months ago

And that's exactly why I commented the way I did. I'll also reply to the original comment with a personal story to elaborate further.

[–] overload@sopuli.xyz 0 points 3 months ago* (last edited 3 months ago) (2 children)

It's not opt-in, as far as I'm aware. Just using Google Photos makes it so. I suppose I'm deep enough in the Google ecosystem (well, let's say my wife is not going to move away from it) to be desensitised to how messed up it kind of is.

I was more talking about how other people (i.e. your friends) will take photos of you and post them on social media, or even just keep them in their Google Photos, and Meta/Google will build a shadow profile for you without your consent via facial recognition.

[–] 0x0@programming.dev 0 points 3 months ago (1 children)

I was more talking about how other people (i.e. your friends) will take photos of you

Friends will oblige should you ask them not to post any media of your underaged infant.

[–] treefrog@lemm.ee 0 points 3 months ago

It's not about posting, that's the point.

Android phones back up all photos to Google's cloud by default. Not everyone knows to turn this off.

[–] scrion@lemmy.world 0 points 3 months ago (1 children)

No, but it's opt-out, and it is your responsibility to ensure that stuff like this doesn't happen - full disclaimer, that is my personal opinion. Pictures of third parties that did not give explicit consent for each and every picture shouldn't be uploaded to cloud providers etc., let alone pictures of kids and other parties who are unable to give proper consent.

My wife is incredibly careless with these things. She wants to know how to properly operate her smartphone and wants to care about, e.g., privacy, and on paper she does. But in practice, we'll do a two-hour session where I explain all the settings to her, where to find them, why they are important, and what implications certain actions/options have for security, safety, and even keeping her phone in working order. Yet as soon as she walks out the door, she no longer cares one bit: she'll blindly accept all kinds of EULAs and default options, never investigate what the notifications about failed backups mean, never delete obsolete or already-backed-up data, etc., up to the point where her phone no longer works and she then instructs Google Photos to upload multiple years of family pictures, full of private moments, multiple children, etc., to Google.

The UI is crappy enough that you'll spend a significant amount of time deleting the pictures remotely, which is absolutely infuriating. I was furious, in particular because I can't say that removing the pictures will also reverse all the potential consequences of sharing them with Google.

For reference, Google Photos does offer facial recognition, stores and estimates locations and even estimates activities based on media content.

IMHO, being this negligent is not excusable in this day and age.

[–] activ8r@sh.itjust.works 0 points 3 months ago (1 children)

I agree with you mostly, and thank you for giving such a passionate and important response.

The problem is not the people though. Placing the "blame" or responsibility on the victims of this invasive behaviour is not the correct conclusion. These settings are deliberately obfuscated and people are uneducated on privacy and how it relates to technology. This is not their fault. Life is far too complicated to place yet another burden on the individual who already has so much to think about. The change needs to come from the people, yes, but it is not the people who need to change.

[–] scrion@lemmy.world 0 points 3 months ago* (last edited 3 months ago)

You are correct. It was probably not perfectly clear from my response, but I do not want to blame the individual here.

Naturally, the "Backup all my files" setting should not be opt-out, and when opting in, there should be easy and succinct explanations of what the implications are.

Lemmy as a whole is apparently a very technical community, so we often tend to forget that an understanding of these implications does not come naturally to all users, and that there are people that need a phone just like everyone else, but might not be in a position to acquire the knowledge required to make an informed decision.

I am fully with you regarding your conclusion, to the point where I applaud regulatory action that protects customer interests, including privacy. I do not believe that companies will sort out these problems on their own (nor do I believe in any form of liberal "self-regulation", really), since it's not in their interest to do so.

I guess I wanted to express that while things are obfuscated and software is full of malicious anti-patterns, we do have to take extra care to protect ourselves and, as was the topic here, our kids. I still actively try to change the status quo, though, politically or through the decisions I make, e.g. looking at open-source projects that are more aligned with what I'd consider to be in the best interest of users, and I'd encourage everyone to do the same.

[–] DannyMac@lemm.ee 0 points 3 months ago (2 children)

Wait until you have photos spanning not only your child but also your cousins' children, who are photographed less often. Google can easily match an infant with the same child at 10 years old. Hell, I can barely do that sometimes and have to use context clues to figure out who the infant was.

[–] dirthawker0@lemmy.world 0 points 3 months ago

I scanned a ton of my mom's family photos after she passed, and uploaded them to Google Photos. It's a bit shocking how good it is at guessing the same person at different ages, even 20+ years' difference.

[–] barsquid@lemmy.world 0 points 3 months ago (1 children)

To be fair to you, you don't have a photo library of millions of children from infant to teen to train your neurons on.

[–] tal@lemmy.today 0 points 3 months ago* (last edited 3 months ago)

Kids "easily traceable" from photos used to train AI models, advocates warn.

I mean, that's true, and could be a perfectly-legitimate privacy issue, but that seems like an issue independent of training AI models. Like, doing facial recognition and such isn't really new.

Stable Diffusion or similar generative image AI stuff is pretty much the last concern I'd have over a photo of me. I'd be concerned about things like:

  • Automated inference of me being associated with other people based on facial or other recognition of us together in photos.

  • Automated tracking using recognition in video. I could totally see someone like Facebook or Google, with a huge image library, offering store owners a service that automatically identifies potential shoplifters if they're allowed to run automated recognition on in-store footage. You could do mass surveillance of a whole society once you start connecting cameras and doing recognition.

  • I'm not really super-enthusiastic about the use of fingerprint data for biometrics, since I've got no idea how far that data travels. Not the end of the world, probably, but if you've been using, say, Google's or Apple's automated fingerprint unlocking, I don't know whether they have enough data to forge a thumbprint and authenticate as you somewhere else. It's a non-revocable credential.

[–] 555@lemmy.world 0 points 3 months ago (2 children)

If you put your shit out there, someone is going to use it. Yeah, that’s not cool, I agree. But what did you think would happen?

[–] mathemachristian@lemm.ee 0 points 3 months ago (1 children)

It was the parents who did it, not the kids.

[–] 555@lemmy.world 0 points 3 months ago (1 children)

Right, what did they think would happen?

[–] mathemachristian@lemm.ee 0 points 3 months ago (1 children)
[–] 555@lemmy.world 0 points 3 months ago (1 children)
[–] mathemachristian@lemm.ee 0 points 3 months ago (1 children)

My point is that it seems like you're disregarding how this affects the kids by saying "well, the parents should've known better", or that you think the kids deserved it since they should've known better, and I thought I'd ask what your point was before drawing any conclusions.

[–] Dkarma@lemmy.world 0 points 3 months ago

No one knows how this affects the kids, dipshit. It's brand new; that's the point of the article.

[–] catloaf@lemm.ee 0 points 3 months ago

I doubt there was much thinking involved.

[–] NutWrench@lemmy.world 0 points 3 months ago (5 children)

Don't store your personal stuff online. If you want to share stuff, send it directly and encrypt it.
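
For anyone who wants to follow that advice, here is a minimal sketch of the "encrypt it, then send it directly" step, using symmetric Fernet encryption from the Python cryptography package; the file names are placeholders, and the key still has to be handed over through a separate, trusted channel.

```python
# Minimal sketch: encrypt a file before sharing it directly, using the
# "cryptography" package's Fernet (symmetric, authenticated encryption).
from cryptography.fernet import Fernet

def encrypt_file(in_path: str, out_path: str) -> bytes:
    """Encrypt in_path into out_path; returns the key needed to decrypt."""
    key = Fernet.generate_key()
    with open(in_path, "rb") as src:
        token = Fernet(key).encrypt(src.read())
    with open(out_path, "wb") as dst:
        dst.write(token)
    return key

def decrypt_file(in_path: str, out_path: str, key: bytes) -> None:
    """Reverse of encrypt_file, run on the receiving end."""
    with open(in_path, "rb") as src:
        data = Fernet(key).decrypt(src.read())
    with open(out_path, "wb") as dst:
        dst.write(data)

# Placeholder file names:
# key = encrypt_file("kid_photo.jpg", "kid_photo.jpg.enc")  # sender
# decrypt_file("kid_photo.jpg.enc", "kid_photo.jpg", key)   # recipient
```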

[–] neomachino@lemmy.dbzer0.com 0 points 3 months ago (1 children)

To a lot of people that's too much effort for "no reason".

People care, but not enough to put any effort in whatsoever.

[–] 01189998819991197253@infosec.pub 0 points 3 months ago

People care to say they care, but don't actually care at all.

[–] jorp@lemmy.world 0 points 3 months ago (1 children)

Also don't go outside or let the Google car drive by your house or have email or throw documents in the trash

[–] NutWrench@lemmy.world 0 points 3 months ago (1 children)

Just don't give companies that don't respect your privacy access to your private life. Keep your online life completely separate from your real life. It's not that difficult.

[–] General_Effort@lemmy.world 0 points 3 months ago (1 children)

Another rubbish hit piece on open source.

[–] ProgrammingSocks@pawb.social 0 points 3 months ago

It's not, and you don't speak for the free software community.

[–] the_doktor@lemmy.zip 0 points 3 months ago (2 children)

Where do you think AI gets all of its information?

There's nothing left to do but ban AI. If we can't even agree to this, we are absolutely lost.

[–] Gimpydude@lemmynsfw.com 0 points 3 months ago

That's just so wrong-headed. How else do you expect billionaires to monetize every aspect of our lives?

[–] extremeboredom@lemmy.world 0 points 3 months ago (12 children)

Trying to ban AI is like trying to ban math. Or staple Jello to a tree. It just doesn't work that way.

[–] Dkarma@lemmy.world 0 points 3 months ago (1 children)

Lol the idea that you need consent to look at someone's publicly posted pictures is laughably wrong.

[–] Emmy@lemmy.nz 0 points 3 months ago (2 children)

Viewing is not the same as "use in a commercial enterprise to turn a profit". Only a fool would think those are the same thing.

[–] ocassionallyaduck@lemmy.world 0 points 3 months ago

This. Anyone can view content online.

Training a visual model on those images requires feeding them into a model, and those are not the terms under which you originally viewed them.

It's why OpenAI is currently facing tons of lawsuits it may legitimately lose in court.

Probably not though, they can just settle and pay a fee. Deep pockets.

[–] surewhynotlem@lemmy.world 0 points 3 months ago (2 children)

You're allowed to videotape in public for profit. Do we consider posting photos online to be public?

[–] Emmy@lemmy.nz 0 points 3 months ago (1 children)

You're allowed to take videos in public, yes, but someone can't then steal that video and use it for just any purpose.

There's a clear distinction

[–] surewhynotlem@lemmy.world 0 points 3 months ago

It's more like if you put it on your porch and say "free, take a copy".

[–] foremanguy92_@lemmy.ml 0 points 3 months ago

When you post something online, it almost becomes a public thing, like a newspaper thrown in the street. Take care of your online privacy! 🏴
