this post was submitted on 17 Aug 2023
Technology
I disagree. I think that there should be zero regulation of the datasets as long as the produced content is noticeably derivative, in the same way that humans can produce derivative works using other tools.
Good in theory. The problem is that if your bot gets too much exposure to a specific piece of media, and the "creativity" value that adds random noise (and in some setups forces it to improvise) is set too low, you get back whatever impression that content made on the AI, like an imperfect photocopy (a non-expert's explanation of "memorization"). Set it too high and you get random noise.
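The tradeoff above can be sketched with a toy snippet. This is not how a real image generator works; it's a hypothetical illustration where a "creativity" knob blends a memorized training sample with random noise, so 0.0 gives an exact copy and 1.0 gives pure noise:

```python
import random

def generate(memorized_sample, creativity, rng=None):
    """Toy sketch of the memorization/noise tradeoff (not a real model).

    creativity = 0.0 -> exact copy of the training sample ("imperfect photocopy")
    creativity = 1.0 -> pure random noise, unrelated to the sample
    """
    rng = rng or random.Random(0)
    return [
        (1 - creativity) * pixel + creativity * rng.uniform(0, 1)
        for pixel in memorized_sample
    ]

sample = [0.2, 0.9, 0.5, 0.1]
print(generate(sample, creativity=0.0))  # reproduces the sample exactly
print(generate(sample, creativity=1.0))  # unrelated noise
```

The "happy medium" the later comments describe would be some value in between, where the output is shaped by the training data but no longer a recognizable copy of any one sample.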
Then it's a cheap copy, not noticeably derivative, and whoever is hosting the trained bot should probably take it down.
Then the bot is trash. Legal and non-infringing, but trash.
There is a happy medium where SD, MJ, and many other text-to-image generators currently exist. You can prompt in such a way (or exploit other vulnerabilities) to create "imperfect photocopies," but you can also create cheap, infringing works with any number of digital and physical tools.
LLMs are not human, the process of training an LLM is not human-like, and LLMs don't have human needs or desires, or rights for that matter.
Comparing them to humans has been a flawed analogy since day 1.
LLMs have no desires = no derivative works? Let an LLM handle your comments; they'd make more sense.