A standard I could see being applied, one that I think has some precedent: if the work that the output is allegedly similar to appears anywhere in the training set, then it's a copyright violation. One valid defense against copyright claims in court is that the defendant could reasonably have been unaware of the original work, and training-set inclusion seems to me like a reasonable equivalent test.
But humans make works that are similar to other works all the time. I just hope we hold AI to the same copyright standards we hold humans to. There is a big difference between a derivative work and one that actually violates copyright.
Doesn't this argument assume that AIs are human? That's a pretty huge reach if you ask me. It's not even clear that LLMs are AI, never mind giving them human rights.
Machine learning falls under the category of AI. I agree that works produced by LLMs should count as derivative works rather than infringing ones, as long as they're not too similar to the originals.
Not every work produced by an LLM should count as a derivative work—just the ones that embody unique, identifiable creative elements from specific work(s) in the training set. We don't consider every work produced by a human to be a derivative work of everything they learned from; work produced by (a human using) an AI should be no different.