
In its submission to the Australian government’s review of the regulatory framework around AI, Google said that copyright law should be altered to allow for generative AI systems to scrape the internet.

[–] FaceDeer@kbin.social 7 points 1 year ago (1 children)

If you want "this kind of stuff" (by which I assume you mean the training of AI) to not be allowed by default, then you are basically asking for a world in which the only legal generative AIs belong to giant well-established copyright holders like Adobe and Getty. That path leads deeper underneath the boots of those ruling classes, not out from under them.

[–] andresil@lemm.ee 8 points 1 year ago* (last edited 1 year ago) (1 children)

I don't think AI should be allowed to be trained on any of this stuff for entertainment/art/etc. at all. The dream future of AI was all the shitty, boring work handled for us so we could sit back, chill, and focus on art, real scientific research, general individual betterment, etc.

Instead we have these companies trying to get AI to do all the art and interesting things, while the rest of us are left with either no job, money, or decent standard of living, or with the dangerous/shitty jobs.

[–] FaceDeer@kbin.social 1 points 1 year ago (1 children)

So to avoid being "under the boot of the ruling classes" you want the government to be in charge of deciding what is and is not the correct way to produce our entertainment and art?

I use Stable Diffusion to generate illustrations for tabletop roleplaying game adventures that I run for my friends. I use ChatGPT to brainstorm ideas for those adventures and to come up with dialogue or descriptive text. How big a fine would I be facing under these laws?

[–] andresil@lemm.ee 8 points 1 year ago (1 children)

I mean, there has to be a price to pay here; we can't have our cake and eat it, unfortunately. Carve-outs like "individual use" could allow this kind of use while preventing companies from taking the piss.

You seem to be implying that the government is the ruling class too, which (I grant you) may at least in part be the case, but at least they're voted into place. Would you rather have companies that we realistically have no control over use it without limit?

Honest question, what would you see as a fair way to handle the situation?

[–] FaceDeer@kbin.social 2 points 1 year ago (1 children)

I mean, there has to be a price to pay here;

Why, because you say so?

Would you rather have companies that we realistically have no control over use it without limit?

Yes, because that means I can also use it without limit. And I see no reason to apply special restrictions to AI specifically; companies are already bound by lots of laws governing their behaviour, and ultimately it's their behaviour that's important to control.

Honest question, what would you see as a fair way to handle the situation?

Handle it the way we already handle it. People are allowed to analyze publicly available data however they want. Training an AI is just a special case of analyzing that data: you're using a program to find patterns in it that the AI can later make use of when generating new material.

[–] andresil@lemm.ee 4 points 1 year ago (1 children)

Why, because you say so?

This is just being obtuse and a bit of a cunt. You can't expect there to be no negative repercussions as an effect of companies being allowed to churn out as much AI-generated shit as they can. Especially since you also say:

companies are already bound by lots of laws governing their behaviour, and ultimately it's their behaviour that's important to control.

Please read what you've written again, but slowly this time. You're saying you're fine with all the other regulation, but that it shouldn't apply here because of individual liberties, when I've clearly stated that free individual use can be specifically allowed for here...

Yes, because that means I can also use it without limit.

You've again stated your problem when I've given a more than sensible solution. Individual free use is fine; why would anyone want to stop you, individually or even with your friends, from being creative? The problem comes when companies with huge resources, influence, and nefarious motives decide to use it. How about this time we get ahead of it instead of letting things get out of control and then trying to do something about it?

[–] FaceDeer@kbin.social 1 points 1 year ago

This is just being obtuse and a bit of a cunt.

No, I'm seriously asking. You said that there has to be a price to pay, but I really don't see why. Why can't people be free to do these things? It doesn't harm anyone else.

It's reasonable to create laws to restrict behaviour that harms other people, but that requires the person proposing those laws to show that this is actually the case, and that the restrictions placed by those laws are reasonable and proportionate, not causing more harm than they prevent.

Individual free use is fine; why would anyone want to stop you, individually or even with your friends, from being creative? The problem comes when companies with huge resources, influence, and nefarious motives decide to use it.

There is no sharp dividing line between these things. What if one of the adventures I create turns out so good that I decide to publish it? What if it becomes the basis for a roleplaying system that becomes popular enough that I start a publishing company for it?

The problem comes when companies with huge resources, influence, and nefarious motives decide to use it.

How about if one of those huge companies just wants to produce some entertainment that will sell really well and that I would enjoy?

You're not really making an argument for banning AI here; you're making an argument for banning nefariousness. That's fine, but it's a separate and much bigger issue.