This post was submitted on 02 Aug 2024
1 point (100.0% liked)

Technology


"Suno’s training data includes essentially all music files of reasonable quality that are accessible on the open internet."

"Rather than trying to argue that Suno was not trained on copyrighted songs, the company is instead making a Fair Use argument to say that the law should allow for AI training on copyrighted works without permission or compensation."

Archived (also bypass paywall): https://archive.ph/ivTGs

[–] Fubarberry@sopuli.xyz 0 points 3 months ago (7 children)

There's nothing stopping you from going to YouTube, listening to a bunch of hit country songs, and using that inspiration to write a "hit country song about getting your balls caught in a screen door". That music was free to access, and your ability to create new works inspired by it is fully allowed under copyright law.

So if that's what the AI is doing, then it would be fully legal if it were a person. The question courts are trying to figure out is whether AI should be treated like a person when it comes to "learning" and creating new works.

I think there are good arguments on both sides of that issue. The big advantage of ruling against AI having those rights is that record labels and other rights holders could get compensated for their content being used. The main disadvantage is that high cost barriers to training material would kill off open-source and small-company AI, guaranteeing that generative AI ends up fully controlled by tech giants like Google, Microsoft, and Adobe.

I think the best legal outcome is one that attempts to protect both interests: companies and individuals below a certain revenue threshold (or other scale metric) can freely train on the open web, but are required to track what was used for training. As they grow, they hit tiers where they're required to start paying for the content their model was trained on. Obviously this needs a lot of work before it's a viable option, but I think something like it is the best way to both keep competition in the AI space and make sure people get compensated.
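
To make that concrete, here's a rough sketch of what such a tiered scheme could look like. All thresholds, rates, and tier boundaries are made-up placeholders, just to illustrate the idea, not an actual proposal:

```python
# Hypothetical tiers: (annual revenue cap in USD, royalty owed per training work in USD)
TIERS = [
    (1_000_000, 0.0),        # small players: train freely, but must keep records
    (100_000_000, 0.001),    # mid-size companies: small per-work fee
    (float("inf"), 0.01),    # tech giants: full licensing rate
]

def training_bill(annual_revenue: float, works_used: int) -> float:
    """What a company would owe for its tracked training set under this sketch."""
    for revenue_cap, rate_per_work in TIERS:
        if annual_revenue <= revenue_cap:
            return works_used * rate_per_work
    raise ValueError("unreachable: the last tier has no cap")

# A small startup and a tech giant, both trained on 10 million tracks:
print(training_bill(500_000, 10_000_000))         # 0.0 -> free, records only
print(training_bill(50_000_000_000, 10_000_000))  # 100000.0 -> pays rights holders
```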

[–] moonlight@fedia.io 0 points 3 months ago (2 children)

I think the solution is just that anything AI generated should be public domain.

[–] MrSoup@lemmy.zip 0 points 3 months ago* (last edited 3 months ago) (2 children)

If you use a tool, let's say Photoshop, to make an image, should it be in the public domain?
Even if the user's effort here is just the prompt, it's still a tool used by a user.

[–] moonlight@fedia.io 0 points 3 months ago (1 children)

If you roll a set of dice, do you own the number?

I don't think it is a tool in the same sense that image editing software is.

But if, for example, you use an LLM to write an outline for something and then heavily edit it, that's transformative, and it's owned by you.

The raw output isn't yours, even though the prompt and final edited version are.

[–] Fubarberry@sopuli.xyz 0 points 3 months ago (1 children)

If you snap a photo of something, you own the photo (at least in the US).

There's a solid argument that someone doing complex AI image generation has done way more to create the final product than someone snapping a quick pic with their phone.

[–] admin@lemmy.my-box.dev 0 points 3 months ago (1 children)

One could also say that building a camera from first principles is a lot more work than entering a prompt into DALL-E, but using false equivalences isn't going to get us very far.

[–] Fubarberry@sopuli.xyz 0 points 3 months ago

I think a fairer comparison in that case would be the difficulty of building a camera vs. the difficulty of building and programming a computer capable of running the AI.

That doesn't really make sense either way though; no one is building their camera or computer from raw materials and then arguing that this gives them stronger intellectual property rights.

[–] Petter1@lemm.ee 0 points 3 months ago

Well, the AI doesn't do all the work; you use public domain material (the AI output) to create your own copyright-protected product/art/thing, etc.

All you have to do is put some human work into the creation. I guess the value of the result still correlates with the amount of human work one puts into a project.

[–] LodeMike@lemmy.today 0 points 3 months ago (1 children)

That's the current status quo.

[–] ClamDrinker@lemmy.world 0 points 3 months ago* (last edited 3 months ago) (1 children)

I don't know why you're being downvoted. You're absolutely correct (at least, in the US). And it seems to be based on pretty solid reasoning, so I could see a lot of other copyright offices following the same idea.

Source: https://www.copyright.gov/ai/ai_policy_guidance.pdf (See header II. The Human Authorship Requirement)

TL;DR: the Office states that “to qualify as a work of ‘authorship’ a work must be created by a human being” and that it “will not register works produced by a machine or mere mechanical process that operates randomly or automatically without any creative input or intervention from a human author.”

[–] LodeMike@lemmy.today 0 points 3 months ago

Yes. Uncopyrightable = public domain. Copyright is not the default.

[–] Even_Adder@lemmy.dbzer0.com 0 points 3 months ago (1 children)

It should be fully legal because it's still a person doing it. Like Cory Doctorow said in this article:

Break down the steps of training a model and it quickly becomes apparent why it's technically wrong to call this a copyright infringement. First, the act of making transient copies of works – even billions of works – is unequivocally fair use. Unless you think search engines and the Internet Archive shouldn't exist, then you should support scraping at scale: https://pluralistic.net/2023/09/17/how-to-think-about-scraping/

Making quantitative observations about works is a longstanding, respected and important tool for criticism, analysis, archiving and new acts of creation. Measuring the steady contraction of the vocabulary in successive Agatha Christie novels turns out to offer a fascinating window into her dementia: https://www.theguardian.com/books/2009/apr/03/agatha-christie-alzheimers-research

The final step in training a model is publishing the conclusions of the quantitative analysis of the temporarily copied documents as software code. Code itself is a form of expressive speech – and that expressivity is key to the fight for privacy, because the fact that code is speech limits how governments can censor software: https://www.eff.org/deeplinks/2015/04/remembering-case-established-code-speech/

That's all these models are: someone's analysis of the training data in relation to each other, not the data itself. I feel like this is where most people get tripped up. Understanding how these things work makes it all obvious.
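
To make that concrete, here's a toy sketch (just a next-word counter, nothing like how Suno or any real model actually works) showing how "training" produces statistics about the data rather than copies of it:

```python
# Toy "training": count which word follows which across a pile of texts.
# What survives is the table of counts (the analysis), not the texts themselves.
from collections import Counter, defaultdict

def train(texts: list[str]) -> dict[str, Counter]:
    """Build next-word counts from the training texts, then discard the texts."""
    model: dict[str, Counter] = defaultdict(Counter)
    for text in texts:
        words = text.lower().split()
        for current, nxt in zip(words, words[1:]):
            model[current][nxt] += 1
    return dict(model)  # only statistics remain; the originals are not stored

model = train([
    "the quick brown fox jumps over the lazy dog",
    "the lazy dog sleeps in the sun",
])
print(model["the"].most_common(2))  # e.g. [('lazy', 2), ('quick', 1)]
```

Real models are enormously more complex, but the principle -- parameters derived from the data rather than copies of it -- is the same.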

[–] Petter1@lemm.ee 0 points 3 months ago

I think AI should be allowed to use any available data, but the resulting model has to be made freely available, e.g. by making it downloadable on Hugging Face.

[–] YeetPics@mander.xyz 0 points 3 months ago (1 children)

Inspiration ≠ mathematically derived similarity.

These aren't artists giving their own rendering; these are venture capitalists using shiny tools to steal other people's hard work.

[–] redisdead@lemmy.world 0 points 3 months ago (1 children)
[–] falcunculus@jlai.lu -1 points 3 months ago

"4 chords" is a cool mashup but it's not really a valid point in this conversation.

The songs in "4 chords" don't use the same four chords, because they're pitched higher or lower than that. So you might say they use the same progression, but that's not quite true either, because the chords aren't always in the same order. The best you can say is "it's possible to perform pitch- and tempo-adjusted excerpts of these songs back-to-back", which isn't a very strong claim.

In fact there's a lot separating the songs in "4 chords": structure, arrangement, rhythm, lyrics, production. It's also perfectly possible to use those four chords in a way you've never heard before and would likely find bizarre -- it's a bit of a meme, but limitation really can breed creativity.

This isn't to defend the lack of creativity in the big music industry. But there's more to it than just saying "4 chords" to imply all musicians do is follow an established grid.

[–] Petter1@lemm.ee 0 points 3 months ago

Yeah, I think only big media corporations would profit from such a copyright rule. The average Joe's creations will still get scraped, because he has no funding to prove infringement and sue those big AI corporations.

[–] ColeSloth@discuss.tchncs.de 0 points 3 months ago

Actually, that music was funded by royalties and ad viewership. No one will pay for an AI to be exposed to an ad, or pay royalties for an AI to hear a song, or play a song to an AI in the hope that the AI buys merchandise or a concert ticket.

I'm reminded of the Blue Man Group's Complex Rock Tour. One of the major themes of the show is the contradiction in terms that is the "music industry": we tend to think of music as an artistic, ethereal thing that requires talent and inspiration, and yet we churn out pop music the same way we churn out cars and smartphones.

[–] Cagi@lemmy.ca 0 points 3 months ago* (last edited 3 months ago) (5 children)

Taking other people's creative works to create your own for-profit product is illegal in every way except when AI does it. AI is not a person watching videos. AI is a product using others' content as its bricks and mortar. Thousands of hours of work on a project you completed being used by someone else to turn a profit, maybe even used in some way you vehemently disagree with, without giving you a dime is unethical and needs regulation from that perspective.

[–] Fubarberry@sopuli.xyz 0 points 3 months ago

That's covered by section 107 of US copyright law, and it's actually fine and protected as fair use in most cases, as long as the work isn't a direct copy and instead changes the result into something different.

All parody type music is protected in this way, whether it's new lyrics to a song, or even something less "creative" like performing the lyrics of song A to the melody and style of song B.

[–] antler@feddit.rocks 0 points 3 months ago (1 children)

Taking other people's creative works to create your own for-profit product is illegal in every way except when AI does it.

No, actually it's completely legal to consume content that was uploaded to the internet and then use it as inspiration to create your own works.

[–] Switchy85@sh.itjust.works 0 points 3 months ago (1 children)

Algorithms don't have "inspiration".

[–] Zacryon@feddit.org 0 points 3 months ago

What is "inspiration" in your opinion and how would that differ from machine learning algorithms?

[–] Gutless2615@ttrpg.network 0 points 3 months ago

Taking other people’s creative works to create your own productive work is allowed if you are making a fair use. There’s a very good argument that use such as training a model on a work would be a fair use under the current test; being a transformative use, that replicates practically no actual part of the original piece in the finished work, that (arguably) does not serve as a replacement for that specific piece in the market.

Fair use is the cornerstone of remix art, of fan art, of huge swathes of musical genres. What we are witnessing is the birth of a new technique based on remixing and unfortunately this time around people are convinced that fighting on the side of big copyright is somehow the good thing for artists.

[–] commie@lemmy.dbzer0.com 0 points 3 months ago

Thousands of hours of work on a project you completed being used by someone else to turn a profit, maybe even used in some way you vehemently disagree with, without giving you a dime is

exactly how human culture progresses, and trying to stop it

is unethical and needs regulation from that perspective.

[–] commie@lemmy.dbzer0.com 0 points 3 months ago (1 children)

Taking other people’s creative works to create your own for-profit product is illegal in every way except when AI does it.

wrong.

[–] Quill7513@slrpnk.net 0 points 3 months ago (2 children)

You're literally on a piracy server. You know about the laws and how hard the corpos crack down on us. Why the fuck are you licking the boots now?

[–] commie@lemmy.dbzer0.com 0 points 3 months ago* (last edited 3 months ago)

i'm principled, so i know that copying content is good and stopping people from copying is bad.

[–] commie@lemmy.dbzer0.com 0 points 3 months ago (1 children)

you're wrong on the facts, this has nothing to do with supporting corporations.

[–] YeetPics@mander.xyz 0 points 3 months ago