Honestly, I find it ridiculous that companies like Disney want to claim ownership over the data used to train AI models. Training data isn’t duplication, it’s learning, the same way humans study, reference, and reinterpret. Trying to monetize every aspect of “influence” is a slippery slope that stifles innovation just to protect legacy revenue streams.
This isn’t about fairness, it’s about control. And the more these companies try to dominate every avenue of cultural expression, the more irrelevant they risk becoming.
Artificial neural networks are simplified models of the neurons arranged in a brain. They’re a useful tool when you know what the output should be for a given input but don’t know what algorithm would produce it. To claim “AI” is learning the same way as complex human brains seems a bit far-fetched. If you want to say human brains are ultimately just an algorithm, then fine, but compare the outputs of the two.
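To make that point concrete, here’s a minimal sketch (plain NumPy, all names and hyperparameters my own invention) of exactly that situation: we know the desired outputs for each input (the XOR truth table) but not an explicit algorithm, so we fit a tiny network to the examples instead.

```python
import numpy as np

# We know the input/output pairs (XOR) but write no rule for computing them;
# gradient descent adjusts the weights until the network reproduces them.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR truth table

# One hidden layer of 8 sigmoid units; sizes chosen arbitrarily.
W1 = rng.normal(size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for _ in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))
    # Backward pass: gradient of squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

# The network typically ends up reproducing XOR without ever being told the rule.
preds = (out > 0.5).astype(int).ravel()
print("predictions:", preds, "final loss:", losses[-1])
```

That’s the whole trick: no reasoning, no understanding, just parameters nudged until outputs match targets. Which is also why equating it with human learning is doing a lot of rhetorical work.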
AI art may not look like duplication, but it often looks like a derivative work, which could trigger copyright infringement (to my non-artist eyes). AI code, on the other hand, looks much closer to duplication to me, and it doesn’t seem right that companies can use others’ code to produce code while ignoring the license because the algorithm “learned like a human”. Many software licenses exist to protect users rather than to monopolize, and they get totally ignored for profit.
“Innovative” these days seems to mean new ways to fuck over users, whereas in the past it meant products got better and/or cheaper.