
Why don't people get that AI copyright fuzzing is bad?

Speaking as a creative who has also been paid for creative work, I'm a bit flustered at how brazenly people wax poetic about the need for copyright law, especially when the creator or artist themselves is never really considered in the first place.

It's not like ye olde piracy, which can even be ethical (think video games that went unpublished and were almost erased from history). This is a new form whereby small companies get to join large publishers in screwing over the standalone creator - except this time it isn't by way of predatory contracts, but by sidestepping the creator and farming their data to recreate a style and form that could have taken years, even decades, to develop.

There's also this idea that "all work is derivative anyways, nothing is original", but that sidesteps the point of having worked for decades to form a style and make a living off it, only for someone to come along and undo all that at the press of a button.

If you're a libertarian or an anarchist, be honest about it. There seem to be a ton of tech bros who are libertarian but subversive about it to feel smort (the GPL is important, btw). At the end of the day the hidden agenda is clear: someone wants to benefit from somebody else's work without paying them, and needs the mental and emotional justification to do so. This is bad, because they then justify taking food out of somebody's mouth, which is par for the course in the current economic system.

It's just further proof that the capitalist system doesn't work and will always screw the labourer in some way. It's quite possible that, in the future, only the most famous artists will make money directly off their work, much like musicians today.

As an aside, Jay-Z and Taylor Swift complaining about not getting enough money from Spotify is tone-deaf: because of Spotify's payout model, they get the bulk of that money anyway - even the money from an account that only ever plays the same small bands. So the big names will always, always be more "legitimate" than small artists, and in that case they've probably already paid their writers and such. But maybe not.. looking at you, Jay-Z.

If the copyright cases get overturned by that litigious lot known as corporate lawyers, and they manage to carve loopholes into legislation that benefit both IP farmers and corporate interests - say, by way of models that train AI to be "far enough" away from the source material - we might see a lot of people lose their livelihoods.

Make it make sense, Beehaw =(

  • Ah, people being scared of new technology in their field is a funny thing to watch in real time

    • Don't make fun of people being scared. Some have invested decades into honing skills that are becoming obsolete, have some empathy.

      • The skills will not be obsolete, I guarantee there will be a market for people to still do all of the drawing/digital art/whatever they do

        There will also be AI tools that they will likely need to learn, or they will be left behind by the majority - sure, but that's what happens when a new tool shakes up your industry

        Also, I never made fun of anyone or lacked empathy. I said it was funny to watch in real time as an industry shifts to new technology, so chill

        • (I still don't think that wording is right... but fine, let's chill 🥶)

          The skills will not be obsolete, I guarantee there will be a market for people

          I wouldn't be so sure.

          Up until recently, you needed a human to translate from "client-ese" into whatever "skill-ese" was required to complete a task. Now we have proof of working LLMs that can do that translation, then feed their own output back to themselves while interacting with other field-specific AIs and tools. And given that all human interaction and knowledge transfer happens through tokenizable language - meaning there is no skill that cannot be transferred to a field-specific AI with a token-based input - all anyone will eventually need is a "looping LLM system with some field-specific AIs".

          The market for people will first shrink to only those running such systems, plus those capable of teaching new tricks to the field-specific AIs. I have serious doubts about the impossibility of creating an AI system that can come up with new field-specific tricks on its own, which means even those people will eventually become obsolete. And since "running an AI system" is itself a teachable skill, those people will become obsolete too.

          Basically, since the introduction of LLMs into the equation, the limit (as time goes to infinity) of the market size for people has become somewhere between 0 and "maybe 1".
