this post was submitted on 09 Aug 2023
379 points (100.0% liked)

In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.

[–] BlameThePeacock@lemmy.ca 2 points 1 year ago (3 children)

Your feelings don't really matter. The fact of the matter is that the goal of AI is literally to replicate the function of a human brain, and the way we're building these systems often mimics the same processes.

[–] nickwitha_k@lemmy.sdf.org 7 points 1 year ago (1 children)

And LLMs and related technologies, by themselves, are artificial but not intelligent. So, the facts are not in favor of your argument to allow commercial parasitism on creative works.

[–] BlameThePeacock@lemmy.ca 2 points 1 year ago (2 children)

I think you're missing a point here. If someone uses these models to produce and distribute copyright-infringing works, the original rights holder could go after the infringer.

The model itself isn't infringing though, and the process of creating the model isn't either.

It's a similar kind of argument to the laws that protect gun manufacturers from culpability when someone uses their weapon to commit a crime. The user is the one doing the bad thing; the manufacturer just produces a tool.

Otherwise, could Disney go after a pencil company because someone used one of their pencils to infringe on their copyright? Even if that pencil company had designed the pencil to be extremely good at producing Disney imagery by looking at a whole bunch of Disney images and movies to make sure it matches the size, colour, etc.? No, because a pencil isn't a copyright infringement of art, regardless of the process used to design it.

[–] nickwitha_k@lemmy.sdf.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

Nah. You're missing the forest for the trees. Let's get abstract:

Person A makes a living by making product X and selling it.

Person B makes a living by making product Y and selling it.

Both A and B are in the same industry.

Person C uses a machine to extract the essence of product X and Y and blend them. Person C then claims authorship and sells it as product Z, which they sell in competition to X and Y.

Person C has not created anything. Their machine has no value in the absence of products X and Y, yet they received no permission and offer no credit or compensation. In addition, they are competing for the same customers and harming the livelihoods of A and B. Person C is acting in a purely parasitic manner that cannot be seen as ethical under any widely accepted definition of the word.

[–] BlameThePeacock@lemmy.ca 1 points 1 year ago (2 children)

You're missing something even more basic.

The machine Person C has created is not infringing on anything by itself. Its creation was not an infringement. "Extracting essence" isn't a protected right under copyright frameworks. Only the actual art it is used to create could infringe (and most of the generated images do not).

If the final art created is an infringement, the existing copyright system handles that situation just like an infringing piece of art created by a human. The person at fault is the person who used the machine to create an infringing work, not the creator of the machine.

In your scenario, if a human C came along, looked at the art from Persons A and B, and blended them together into their own style, there wouldn't be any problem either, even though they received no permission and offered no credit nor compensation to the original creators. They would only get in trouble if they created an actual piece of art that was too similar to one of the specific artists' works and was therefore found to infringe on the copyright.

[–] nickwitha_k@lemmy.sdf.org 2 points 1 year ago (1 children)

First, feeding something into a machine is not the same as looking at it. Person C literally creates nothing. They are a parasite. There's far more to creating than running statistical modeling algorithms. One cannot claim that that's what people studying a style and then creating something are doing, because it is empirically false.

Second, the scope of the discussion is not just "can someone legally get in trouble".

[–] BlameThePeacock@lemmy.ca 1 points 1 year ago (1 children)

"Feeding something into a machine is not the same as looking at it"? Most scientists would vehemently disagree. The human brain is just a complex and squishy computer. The fact that it's biological makes no difference to how we function: input goes in, processing occurs, output comes out. Even the term "computer" started as a job title for a human prior to the invention of mechanical and electric devices.

The scope of the discussion is absolutely what would get you in trouble. That's literally the entire post we're commenting on. We're not arguing if this SHOULD be allowed or not, we're arguing about whether current laws prohibit it.

You keep harping on about parasites, is every person who creates a machine to do a task that competes with humans parasitical in your fucked up world logic? If we want to make a machine to build widgets, an engineer will study how widgets get built, design a machine to do it instead, produce the machine, then a company will use it to outcompete the original manual widget makers. Same process for essentially every machine we've ever invented.

[–] nickwitha_k@lemmy.sdf.org 1 points 1 year ago

"Feeding something into a machine is not the same as looking at it"? Most scientists would vehemently disagree. The human brain is just a complex and squishy computer.

In that aspect, we are absolutely in agreement. We are meat computers in meat cages containing necessary support systems. That statement was, perhaps, an oversimplification.

Things like LLMs are attempts to model how the human brain works but are not identical, nor are LLMs, by themselves, capable of intelligence. If one argues contrarily that feeding data into an LLM and using it to produce something is the same, then the one using the LLM is clearly not the author and claiming so is plagiarism of the work of either the creator of the LLM or the LLM itself.

The argument that, legally, IP owners cannot specify that their works may not be used as feedstock for competing commercial products is rather absurd itself and would invalidate all but the most permissive open-source licenses as well as proprietary licenses. As pointed out elsewhere, this line of thought would allow one to steal leaked source code and use it to effectively clone existing software. Use of the source in this manner would be infringing on the owner's IP rights.

Perhaps a good way to think about LLMs is as automated reverse engineering. They take data and statistically model it in order to characterize it. There is substantial case law there and the EFF has a great FAQ on the topic: https://www.eff.org/issues/coders/reverse-engineering-faq
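The phrase "take data and statistically model it in order to characterize it" can be made concrete with a toy character-level bigram model. This is a deliberately tiny sketch for illustration only (the corpus and function names are invented here); production LLMs use vastly larger neural models, but the underlying idea of counting statistical regularities in the training data is the same:

```python
from collections import Counter, defaultdict

def train_bigram_model(text):
    """Count, for each character, which characters tend to follow it."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def most_likely_next(model, ch):
    """Return the character most often observed after `ch` in training."""
    return model[ch].most_common(1)[0][0]

corpus = "the theory of the thing"
model = train_bigram_model(corpus)
print(most_likely_next(model, "t"))  # prints 'h': it follows 't' most often here
```

The model "characterizes" its input purely by its statistics; everything it can produce is derived from the data it was fed.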

[–] nickwitha_k@lemmy.sdf.org 2 points 1 year ago* (last edited 1 year ago) (1 children)

The scope here is not limited to "can someone legally get in trouble under current law" (which seems likely, but is still working its way through the courts). The discussion is specifically about ethics. Person C has created nothing. They would have no product to sell if not for Persons A and B. Their competition with those their product is derived from is a parasitic relationship, plain and simple. They are performing an act of exploitation with measurable harm both to Persons A and B and to the further development of their craft, by destroying any incentive to continue it.

Now, in some sort of alternate economic system, where one's livelihood is not tied to their vocation, sure, it's possibly not problematic because the economic harm is removed. However, in current capitalist systems that are in place where LLMs are heavily hyped, it's an ethically bankrupt action to take.

ETA: No amount of mental gymnastics can change the fact that use of others' works without their consent to train a model, then claiming authorship and competing IS plainly theft of the labor that went into creating the original works.

That's not to say that LLMs and the like don't have value, or that they don't often require effort to produce something worthwhile. Just that they need to be used in an ethical manner that improves the human condition, not as another tool to rob others of the fruits of their labor.

[–] BlameThePeacock@lemmy.ca 1 points 1 year ago (1 children)

I'll remind you that the original article title literally contains the words "copyright law".

This discussion is entirely about legality, not ethics.

By your stupid logic, I have created nothing in my job designing automation systems, since I just look at what people currently do, program a computer to do those tasks instead, and I profit off those people no longer needing to do that job.

You want to keep everyone fully employed in needless tasks? Go join the Mennonites.

[–] nickwitha_k@lemmy.sdf.org 1 points 1 year ago

I feel that you're being deliberately obtuse here in order to avoid the ethics dilemma.

A design is a "thing", and software is a "thing", even if it is physically intangible. Designing automation systems requires more than just looking at existing processes or algorithmic modeling. It requires synthetic and abstract thought. Nor is it a parasitic process; the automation has value by itself and is not dependent upon the outputs of those whose tasks it automates. Automation, in theory, also improves the human condition by reducing the amount of labor required of a given individual (though this particular good has largely been stolen since the 80s).

[–] Zapp@beehaw.org 3 points 1 year ago

The goal of AI is fictional, and there's no solid evidence today that it will ever stop being fiction.

What we have today are stupid learning algorithms that are surprisingly good at mimicking intelligent people.

The most apt comparison today is a particularly clever parrot.

I'm all for having the discussion about how to handle AI when we have it, but it's bad faith to apply it to what we have today.

Critically, what we have today will never go on strike, or really make any kind of correct moral decision on its own. We must treat it like dumb automation, because it is dumb automation.

[–] acastcandream@beehaw.org 1 points 1 year ago

the fact of the matter is that the goal of AI is literally to replicate the function of a human brain

…says who? That’s absolutely your feeling and not facts.