Generative AI closes off a better future — Ursula Le Guin said we must be able to imagine freedom. AI traps us in the past.

www.disconnect.blog

ChatGPT cannot imagine freedom or alternatives; it can only present you with plagiarized mash-ups of the data it’s been trained on. So, if generative AI tools begin to form the foundation of creative works and even more of the other writing and visualizing we do, it will further narrow the possibilities on offer to us. Just as previous waves of digital tech were used to deskill workers and defang smaller competitors, the adoption of even more AI tools has the side effect of further disempowering workers and giving management even more control over our cultural stories.

As Le Guin continued her speech, she touched on this very point. “The profit motive is often in conflict with the aims of art,” she explained. “We live in capitalism, its power seems inescapable — but then, so did the divine right of kings. Any human power can be resisted and changed by human beings. Resistance and change often begin in art. Very often in our art, the art of words.” That’s exactly why billionaires in the tech industry and beyond are so interested in further curtailing how our words can be used to help fuel that resistance, which would inevitably place them in the line of fire.

[…]

The stories and artworks that resonate with us are inspired by the life experiences of the artists who made them. A computer can never capture a similar essence. Le Guin asserted that to face the challenging times ahead, we’ll need “writers who can remember freedom — poets, visionaries — realists of a larger reality.” Generative AI seems to be part of a wider plan by the most powerful people in the world to head that off, and to trap us in a world hurtling toward oblivion so they can hold onto their influence for a little longer.

As Le Guin said, creating art and producing commodities are two distinct acts. For companies, generative AI is a great way to produce even more cheap commodities to keep the cycle of capitalism going. It’s great for them, but horrible for us. It’s our responsibility to challenge the technology and the business model behind it, and to ensure we can still imagine a better tomorrow.

38 comments
  • Once again, the "plagiarism machine" misunderstanding of how LLMs operate. It's simply not true. They don't "mash up" their training material any more than a human "mashes up" their training material. They learn patterns from their training material. Is a human author who writes yet another round of the Hero's Journey creating a "plagiarized mash-up" of past stories? When a poet writes yet another sonnet, are they just aping past poems they've seen?

    And even if it were so, complaints like these are self-contradictory. If LLM output really is just boring old retreads of stuff that went before, why is it a "threat" to skilled human authors who can produce new material? If those human authors really are inherently better than the LLMs, what's the big deal? It's not like it's a new thing for there to be content farms and run-of-the-mill stories churned out en masse. Creative authors have always had to compete with that kind of thing. Nobody's "curtailing" them, they're just competing with them. Go ahead and compete right back.

    • I generally agree with your stance on AI (in the end it is another tool for human artists to use), but the problem with the competition you describe is that it competes on price for the entry-level jobs artists might find. That in turn leaves little opportunity for human artists to learn on the job and reach levels that surpass AI-generated art.

      While a lot of art is of course not done as a commercial endeavour, the prospect of turning it into some sort of income (or fame) is usually a motivating factor for artists starting out. With that motivation gone, many will turn to other professions, which in the end is likely a loss for society overall. A good example of this is comic book artists in France vs. Germany. In France there is a rich scene of comic book artists with regular publications, mainly because there were commercial publishers early on and people could aspire to a "career" as a comic book artist (with varying success...). None of that exists in Germany as far as I know, and the reason is that young people don't think it is worth learning how to draw comics, which then becomes a chicken-and-egg problem.

      • I don't believe this is going to happen soon. I was quite worried for a while that my translation job could be killed by ChatGPT, but nothing changed that hadn't already changed a few years ago with so-called 'machine translation'. What did that mean for the writer/translator? I had to negotiate a price for a new service with agencies, for 'machine translation post-editing' - so I just made it as expensive as translation, fuck you. I'm happy to use machine translation for my work, but I choose my own engine and check the work before sending it out, because there will always be funny mistakes. AI is good enough to save me lots of typing work, but in no way good enough to be left alone to produce any text ready for publishing.

        In the case of writers, who are now AI prompt inventors and AI text pre-publishing editors (here, have a fancy brand-new acronym: ATPE), that would mean: negotiate your price for any job (make sure that feeding a prompt into an AI costs the same as writing an article) so you and your family can live comfortably. Don't let them eat you alive.

        • Well, yes... but that is the current insider view.

          I was referring to the impression young people have when they decide what they want to invest time in learning. Those views are often highly distorted from reality, as any insider will acknowledge in retrospect, but that doesn't make them any less relevant. Also, young people will try to extrapolate at least a few years into the future (the time it takes to finish an art degree, for example), and AI will likely get better in those years.
