this post was submitted on 09 Jan 2025
Technology

60560 readers
3602 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
grue@lemmy.world 0 points 1 week ago (last edited 1 week ago)

I’m glad someone is sane ITT.

https://www.youtube.com/watch?v=uY9z2b85qcE

To be clear, I think it ought to be the case that at least "copyleft" GPL code can't be used to train an LLM without all of the LLM's output becoming GPL (which, if that GPL training data were mixed with proprietary training data, would likely make the model legally unusable as a whole). AFAIK it's way too soon for there to be a precedent-setting court ruling on the question, though.

In particular...

I thought Sony Corp. of America v. Universal City Studios, Inc. would serve as the basis there

...I don't see how this has any relevancy at all, since the whole purpose of an LLM is to make new -- arguably derivative -- works on an industrial scale, not just single copies for personal use.

LainTrain@lemmy.dbzer0.com 1 points 1 week ago (last edited 1 week ago)

...I don't see how this has any relevancy at all, since the whole purpose of an LLM is to make new -- arguably derivative -- works on an industrial scale, not just single copies for personal use.

Because it's the same basic reason that hard drives are legal. With a hard drive and a PC, I can make practically infinite copies of copyrighted material from, e.g., a DVD.

Only the wrongful redistribution of such copies is an actual crime, not providing the tooling for it; otherwise Seagate or any other HDD manufacturer would be liable for copyright infringement I commit using its drives, and so would my ISP if I distributed the copies, etc.

The ruling was particularly clear: VCRs were allowed to stay on the market because they were capable of substantial non-infringing uses. The same reasoning applies to hard drives.

I'm sure you see where I'm going with this. Because LLMs can generate output that (regardless of your opinion on whether all output is theft) would not be found infringing under a substantial-similarity test in court, they too have non-infringing uses.

Therefore, it's the person who prompts for and generates the copyright-infringing material who is responsible, not the researcher, patent-holder, programmer, dataset provider, supplier, or distributor of the LLM.

To be clear, I think it ought to be the case

Fair enough, thanks for the clarification.