this post was submitted on 09 Aug 2023
369 points (100.0% liked)
Technology
If you want "this kind of stuff" (by which I assume you mean the training of AI) to not be allowed by default, then you are basically asking for a world in which the only legal generative AIs belong to giant well-established copyright holders like Adobe and Getty. That path leads deeper underneath the boots of those ruling classes, not out from under them.
I don't think AI should be allowed to be trained on any of this stuff for entertainment/art/etc. at all. The dream future of AI was one where all the shitty boring stuff was handled for us so we could sit back, chill, and focus on art, real scientific research, general individual betterment, etc.
Instead we have these companies trying to get AI doing all the art and interesting things, while the rest of us are left with no job, money, or decent standard of living, or stuck with the dangerous and shitty jobs.
So to avoid being "under the boot of the ruling classes" you want the government to be in charge of deciding what is and is not the correct way to produce our entertainment and art?
I use Stable Diffusion to generate illustrations for tabletop roleplaying game adventures that I run for my friends. I use ChatGPT to brainstorm ideas for those adventures and come up with dialogue or descriptive text. How big a fine would I be facing under these laws?
I mean, there has to be a price to pay here; we can't have our cake and eat it, unfortunately. Caveats like "individual use" could allow this type of use while preventing companies from taking the piss.
You seem to be implying that the government is the ruling class too, which (I grant you) may at least in part be the case, but at least they're voted into place. Would you rather have companies, over which we realistically have no control, use it without limit?
Honest question, what would you see as a fair way to handle the situation?
Why, because you say so?
Yes, because that means I can also use it without limit. And I see no reason to apply special restrictions to AI specifically; companies are already bound by plenty of laws governing their behaviour, and ultimately their behaviour is what's important to control.
Handle it the way we already handle it. People are allowed to analyze publicly available data however they want. Training an AI is just a special case of analyzing that data: you're using a program to find patterns in it that the AI can later make use of when generating new material.
This is just being obtuse and a bit of a cunt. You can't expect there to be no negative repercussions from companies being allowed to just churn out as much AI-generated shit as they can. Especially since you also say:
Please read what you've written again, but slowly this time. You're saying you're fine with all the other regulation, but it shouldn't be done here because of individual liberties, when I've clearly stated that free use can be specifically allowed for here...
You've again stated your problem when I've given a more than sensible solution. Individual free use is fine; why would anyone want to stop you, individually or even with your friends, from being creative? The problem comes when companies with huge resources, influence, and nefarious motives decide to use it. How about this time we get ahead of it instead of letting things get out of control and then trying to do something about it?
No, I'm seriously asking. You said that there has to be a price to pay, but I really don't see why. Why can't people be free to do these things? It doesn't harm anyone else.
It's reasonable to create laws to restrict behaviour that harms other people, but that requires the person proposing those laws to show that this is actually the case, and that the restrictions placed by those laws are reasonable and proportionate, not causing more harm than they prevent.
There is no sharp dividing line between these things. What if one of the adventures I create turns out so good that I decide to publish it? What if it becomes the basis for a roleplaying system that becomes popular enough that I start a publishing company for it?
How about if one of those huge companies just wants to produce some entertainment that will sell really well and that I would enjoy?
You're not really making an argument for banning AI here. You're making an argument for banning nefariousness. That's fine, but that's kind of a bigger, separate issue.