this post was submitted on 26 Jul 2023
847 points (96.5% liked)
Technology
you are viewing a single comment's thread
But the AI is looking at thousands, if not millions, of books, articles, comments, etc. That's what humans do as well: they draw inspiration from a variety of sources. So is sentience the distinguishing criterion for copyright? Only a being capable of original thought can create original work, and therefore anything not capable of original thought cannot create copyrighted work?
Also, it's irrelevant here, but calling LLMs glorified autocomplete is like calling a jet engine a "glorified horse". Technically true, but you're trivialising it.
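For what it's worth, the "autocomplete" in that framing refers to next-token prediction: the model scores every candidate next token and the output is built one token at a time. Here is a minimal sketch of that single step, assuming the Hugging Face transformers library and GPT-2 (neither is named in this thread):

```python
# A minimal sketch of the "autocomplete" step of an LLM: rank the most
# likely next tokens given a prompt. The library and model choices
# (Hugging Face transformers, GPT-2) are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "The author of the novel"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, seq_len, vocab_size)

# Probabilities for the single next token after the prompt.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)

for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)])!r:>10}  p={prob.item():.3f}")
```

The jet-engine point is about scale: the per-step mechanism is simple, but it is repeated billions of times by a model trained on the very corpora the rest of this thread is arguing about.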
Yes. Creative work is made by creative people. Writing is creative work. A computer cannot be creative, and thus generative AI is a disgusting perversion of what you wanna call “literature”. Fuck, writing and art have always been primarily about self-expression. Computers can’t express themselves with original thoughts. That’s the whole entire point. And this is why humanistic studies are important, by the way.
I absolutely agree with the second half, guided by Ian Kerr's paper "Death of the AI Author"; quoting from the abstract:
I think the part courts will struggle with is this: if this 'thing' is not an author of the works, then can it infringe either?
The trivialization doesn't negate the point, though, and LLMs aren't intelligence.
The AI consumed all of that content, and I would bet that not a single one of the people who created it was compensated, yet the AI relies strictly on those people's work to produce anything coherent.
I would argue that yes, generative artificial stupidity doesn't meet the minimum bar of original thought necessary to create a standard copyrightable work unless every input was used with consent, and laundering content through multiple generations of an LLM, or through multiple distinct LLMs, should not impact the need for consent.
Without full consent, it's just a massive loophole for those with money to exploit the hard work of the masses who generated all of the actual content.